Steps to build a Revit A.I. Assistant!

Just 3 months back, our team participated in HackAsia 2020 with Gammon Construction as our challenge partner. While we did not win the competition, it was an extremely satisfying project, as the team FINALLY brought an idea (…one that had been lingering for more than a year) to life!

We built GamBot 2.0, an A.I. assistant in Revit that works in the background, giving recommendations while users build models. Since we only developed a proof of concept, GamBot’s only ability was predicting the airflow requirements of room elements.

There were a number of questions about the methodology, so I thought we should share this knowledge with everyone.

The Proposal

The following is the initial proposal that got us through the doors of the hackathon. We proposed a process that closes the loop between past projects and current ongoing projects, with the intention of leveraging rich past-project data. The process involved:

  1. Extracting data from past models in IFC
  2. Storing it in a database
  3. Using the data to train ML models
  4. Feeding ML outputs to current models in the background

Why IFC?

Particularly, IFC-compliant models, for two main reasons:

i. Semantic meaning

Like any schema describing a data exchange format, IFC represents an open specification for the BIM data exchanged in a building project. It covers everything from a very high level, e.g. site information, down to a very detailed level, e.g. individual door specifications.

Structuring data with semantic meaning should be step one, before any robust downstream processes can be developed. Without a shared semantic reference, it’ll be difficult to build scalable cross-platform integrations, since no one is ‘speaking the same language’. For instance, Software A could categorise an object as ‘Floor’ while Software B uses ‘Slab’ for the same purpose.

When training machine learning models related to BIM, we can imagine the need to collect data from BIM models. Without adherence to a shared semantic reference, no two models are built the same way; hence the need for IFC to govern the data structure in a BIM model.

ii. OpenBIM format

Files generated by proprietary BIM authoring tools are ‘closed’ formats. Chances are, there are few or no ways to directly parse such file formats into usable data. However, there are multiple known ways to inspect data from IFC models directly, using open-source projects such as xBIM, IfcOpenShell, IfcPlusPlus, etc. (check them out here).

The intent was to have different BIM applications generating, and being able to read, the same piece of structured data! Imagine the possibilities once interoperability issues are no more!
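To make this concrete, here is a standard-library-only peek at what STEP-encoded IFC data looks like and why the entity type names carry semantic meaning. The fragment and GUIDs below are hand-written for illustration; a real pipeline would use a proper parser such as IfcOpenShell or xBIM rather than a regex.

```python
import re

# A tiny, hand-written fragment of a STEP-encoded (SPF) IFC file.
# Real files are generated by BIM tools; this is purely illustrative.
ifc_fragment = """
#101=IFCSPACE('2O2Fr$t4X7Zf8NOew3FLOH',#5,'Meeting Room',$,$,#102,#103,$,.ELEMENT.,.INTERNAL.,$);
#201=IFCSLAB('1hqIFTRjfV6AWq_bMtnZwI',#5,'Floor Slab',$,$,#202,#203,$,.FLOOR.);
#301=IFCDOOR('0jf0kOafj2HhO9RlGCSobx',#5,'Door D1',$,$,#302,#303,$,2100.,900.);
"""

# Every entity line has the form "#id=IFCTYPE(...);" -- the type name is the
# shared semantic vocabulary (the 'Floor' vs 'Slab' ambiguity goes away).
entities = re.findall(r"#(\d+)=(IFC\w+)\(", ifc_fragment)
for entity_id, ifc_type in entities:
    print(entity_id, ifc_type)
```

Whatever tool authored the model, a consumer only needs to know the IFC entity types, not the authoring software’s internal vocabulary.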

Based on this vision of IFC-compliant model data, the data can then be stored in a data warehouse to be used as training data. We then proposed training modular ML models for different purposes, e.g. cost estimation, design compliance, etc.

After which, these models could sit on a local or cloud server, and local BIM applications would activate (i.e. query) these ML models behind the scenes, sending real-time recommendations or predictions to users.

The Hackathon

Now that we’re done romanticizing about IFC-compliant models, here’s what we actually did for the hackathon project. The process was:

  1. Data extraction & preparation
  2. Train ML model
  3. Host model on Azure
  4. Revit plugin to query model

1. Data extraction & preparation

Data & ML enthusiasts will recognise that the process follows the basic ML pipeline, i.e. data collection, preparation, training, and inference.

Initial data extracted.

Prepped input data, X.

Prepped label data, y.

Split data into training and test sets for accuracy evaluation.

You may have noticed that data under the ‘modeofventilation’, ‘buildingtype’, and ‘level’ headers were encoded as integers. Machines understand numbers, not text, so discrete string values were converted to numbers using categorical encoding (LabelEncoder documentation).
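As an illustration of that encoding step, the sketch below runs scikit-learn’s LabelEncoder over some made-up ‘modeofventilation’ values; the actual categories in our dataset may differ.

```python
from sklearn.preprocessing import LabelEncoder

# Made-up values for the 'modeofventilation' column, for illustration only.
modes = ["Natural", "Mechanical", "Natural", "AirConditioned"]

encoder = LabelEncoder()
encoded = encoder.fit_transform(modes)

# Classes are sorted alphabetically, so the integer codes are:
# AirConditioned -> 0, Mechanical -> 1, Natural -> 2
print(encoded.tolist())           # [2, 1, 2, 0]
print(encoder.classes_.tolist())  # ['AirConditioned', 'Mechanical', 'Natural']
```

The same fitted encoder can later turn integer predictions back into readable labels via encoder.inverse_transform.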

2. Train ML Model

We used Scikit-Learn to train our ML model, and thanks to the convenience and accessibility of the package, we were able to test out a few training algorithms and go with the one with the highest accuracy, i.e. the Random Forest Classifier.

Logistic Regression.

Random Forest Classifier.

Support Vector Machines.

Package for Deployment!
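The algorithm comparison and packaging above can be sketched roughly as follows. The synthetic dataset stands in for our extracted space data, and the hyperparameters are illustrative defaults, not the values we actually tuned.

```python
import pickle

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for the prepped (X, y) data shown earlier.
X, y = make_classification(n_samples=200, n_features=4, n_informative=3,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)

# Try the three algorithms we compared and keep the most accurate one.
candidates = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(),
}
scores = {name: clf.fit(X_train, y_train).score(X_test, y_test)
          for name, clf in candidates.items()}
best_name = max(scores, key=scores.get)
print(scores, "->", best_name)

# Package the winning model for deployment.
with open("model.pkl", "wb") as f:
    pickle.dump(candidates[best_name], f)
```

On our real dataset the Random Forest Classifier came out on top; on other data a different candidate may well win.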

3. Host Model on Azure

We aren’t experts in cloud technology, but the amount of guides and resources available online made it relatively simple to find the right approach to deploy the model to the cloud. We went with Azure, as the guides and documentation available did exactly what we needed (…and using Azure yielded bonus points!)

Here’s generally what you need to do to deploy the model as an API endpoint; check out the resources that helped us through the hackathon:

  1. Make a resource group
  2. Define a workspace in that resource group
  3. Register the ML model
  4. Deploy the ML model to a Service

Deploy Model:

Python SDK:
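For step 4, deploying as a service, Azure expects an entry script (conventionally score.py) exposing init() and run() functions. Below is a self-contained sketch of that contract; the tiny in-line model and feature rows are made-up stand-ins so the example runs on its own, whereas a real score.py would load the registered .pkl inside init().

```python
import json

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Stand-in for the registered model; a real entry script would load the
# registered model file in init() instead. The feature rows are made up.
model = RandomForestClassifier(n_estimators=10, random_state=0)
X_dummy = np.array([[0, 1, 2, 30.0], [1, 0, 1, 15.0], [2, 2, 0, 45.0]])
y_dummy = np.array(["low", "medium", "high"])
model.fit(X_dummy, y_dummy)

def init():
    # In a real deployment: load the registered model from disk here.
    pass

def run(raw_data):
    # Azure ML passes the request body; we expect {"data": [[...feature row...]]},
    # matching the { data = input } shape sent from the Revit plugin later on.
    payload = json.loads(raw_data)
    features = np.array(payload["data"])
    prediction = model.predict(features)
    return json.dumps({"prediction": prediction.tolist()})

# Simulate a request locally:
init()
print(run(json.dumps({"data": [[0, 1, 2, 30.0]]})))
```

Whatever run() returns is what the deployed endpoint sends back, so its JSON shape should match what the plugin deserializes.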

4. Revit plugin to query model

Now, we’re just left with integrating this with Revit!

An important user experience we wanted to achieve was the ‘feel’ of having an A.I. assistant providing live recommendations to users and writing parameters on the fly. To achieve that, two key features had to be implemented in the Revit plugin working behind the scenes:

i. Add trigger via Revit API’s Dynamic Model Updater (IUpdater interface)
ii. Query ML model and push results upon trigger

i. Dynamic Model Updater (IUpdater interface)

The IUpdater interface allowed us to register custom triggers; in our case, specifically triggers that fire when a parameter changes in a space element. E.g. in the simplified code below, the trigger fires when the defined change type, room height, changes for a space element.

using (ElementCategoryFilter ecf = new ElementCategoryFilter(BuiltInCategory.OST_MEPSpaces))
{
	//. . .
	List<ChangeType> changeTypes = new List<ChangeType>()
	{
		Element.GetChangeTypeParameter(new ElementId(BuiltInParameter.ROOM_HEIGHT))
	};

	ChangeType mergedChangeType = null;
	foreach (ChangeType changeType in changeTypes)
	{
		if (mergedChangeType == null)
			mergedChangeType = changeType;
		else
			mergedChangeType = ChangeType.ConcatenateChangeTypes(mergedChangeType, changeType);
	}

	UpdaterRegistry.AddTrigger(updater.GetUpdaterId(), _activeDocument, ecf, mergedChangeType);
}

Upon trigger, the updater lets us make transactions via the Revit API to write parameters to space elements and notify users using popup dialogs.

Here are useful references that will guide you through the details of implementing IUpdater, from registering the updater to executing transactions.

TheBuildingCoder:

Revit API docs:

ii. Query ML model

Before making transactions, we need to get feedback from the deployed ML model. Here’s the process that happens when the trigger fires:

  • Perform REST request on the deployed ML model
  • Deserialize response from ML model endpoint
  • Write parameters to space elements
  • Notify user using popup dialog

A static method for the REST request is shown below. Note that the serialized input must be attached to the request body, otherwise the endpoint receives an empty payload.

internal static Output RequestMLPrediction(Input input)
{
	//Wrap the serialized Revit parameters into the JSON body the endpoint expects
	var obj = new { data = input };

	//Send 'POST' request to endpoint
	var client = new RestClient("http://xxxxx-df17-4826-8419-"); //dummy url
	client.Timeout = -1;
	var request = new RestRequest(Method.POST);
	request.AddHeader("Content-Type", "application/json");
	request.AddParameter("application/json", JsonConvert.SerializeObject(obj), ParameterType.RequestBody);
	IRestResponse response = client.Execute(request);

	//Deserialize JSON response into Output object
	Output pred = JsonConvert.DeserializeObject<Output>(response.Content);
	return pred;
}

internal class Output
{
	public string Prediction { get; set; }

	public string Accuracy { get; set; }
}
Put it all together and we have an A.I. assistant running in Revit!

Hopefully this points you in the right direction if you’re building something similar. Questions? Interested in working with us? Drop us an email at .

P.S. We’d love to share good news on the launch of our new website for business opportunities. Check out ! Do write in to chat with us, and share it with your colleagues!! Little things like this help keep us going, and we truly appreciate your support!
