We started with file-based, complete, single models. Then we needed to share other people's work within our own models, so we had references. Then we needed to analyse those references together, so we got federated models. With me so far? All these scenarios are based on file exchanges. Now we have virtual models: the data does not live in a single format but is viewable and editable from different applications, all from within the model view, overcoming many issues of data authoring, authority, and qualification.
This is how it works.
Data exists in two states:
- In a file
- In a data service
The model file holds references to the data service, and each model object within the model file holds references to the corresponding objects in the data service. It's really exciting because at last we can eliminate the problems of import and export with loss of fidelity. Data ownership is critical to authenticity, and CAD systems are just not set up to work like databases.
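As a minimal sketch of that idea (every name, ID and attribute here is made up, and the remote service is simulated with a dictionary where a real client would make an HTTP call):

```python
# Minimal sketch: a model object stores only a reference (a data-service
# address plus a remote object ID); the live attributes are fetched from
# the data service on demand. All names here are hypothetical.

MODEL_FILE = {
    "data_service": "https://example.com/service",   # reference to the data service
    "objects": [
        {"local_id": "door-01", "remote_id": "obj-1001"},
        {"local_id": "door-02", "remote_id": "obj-1002"},
    ],
}

# Stand-in for the remote data service; a real client would issue a request here.
FAKE_SERVICE = {
    "obj-1001": {"manufacturer": "Acme", "fire_rating": "FD30"},
    "obj-1002": {"manufacturer": "Acme", "fire_rating": "FD60"},
}

def resolve(model, local_id):
    """Look up a model object's live attributes via its data-service reference."""
    for obj in model["objects"]:
        if obj["local_id"] == local_id:
            return FAKE_SERVICE[obj["remote_id"]]
    raise KeyError(local_id)

print(resolve(MODEL_FILE, "door-01"))  # {'manufacturer': 'Acme', 'fire_rating': 'FD30'}
```

The point is that the file only carries the mapping; the authoritative attribute values stay with whoever owns the data service.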
The current practice being promoted is to pass IFC files around the team, but the same problems exist: it is not a concurrent working model. IFC is first a data model (http://en.wikipedia.org/wiki/Data_model) and second a file exchange format. It is becoming common to use BIM objects created by 3rd parties for product manufacturers, which contain all the attributes that may or may not be required by the client. There is an expectation that the contractor/installer will complete many of these data attributes in a provided model file. IFC models are being loaded into common data environments (CDEs), from which the fields can then be accessed and edited using whatever tools the CDE provides. This means that the only way of accessing the data is through the CDE's tools.
As a data model, IFC is built for the open data service network, and the work currently being done on ifcXML is going to significantly open up the BIM world to the many application developers using data services at their core.
Data services are very well established in everyday life, and as the name implies, a data service is simply a service that delivers data on request. The data can be files, or records from a system such as a back-end database. In the 1980s the rapid development of computing threw up headaches for IT departments needing to make different systems talk to each other, and with the advent of HTML a data format called XML (Extensible Markup Language, 1998, http://en.wikipedia.org/wiki/XML) was devised with the goals of simplicity and being both human and machine readable. It is human readable because it is in ASCII (American Standard Code for Information Interchange, 1964/1981, http://en.wikipedia.org/wiki/ASCII), which encodes a specified character set as 7-bit binary integers that just about every computer system can understand. It is machine readable because the data is contained within tags, or in attributes within tags.
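To make that concrete, here is a tiny made-up product record in XML. A human can read it by eye, and a few lines of Python can read it by machine using the standard library's XML parser:

```python
import xml.etree.ElementTree as ET

# A made-up product record: readable by a human as plain ASCII text...
xml_text = """
<product id="P-100">
    <manufacturer>Acme Doors Ltd</manufacturer>
    <fireRating>FD30</fireRating>
</product>
"""

# ...and readable by a machine, because the data sits in tags
# and in attributes within tags.
root = ET.fromstring(xml_text)
print(root.get("id"))                  # P-100 (an attribute within a tag)
print(root.find("manufacturer").text)  # Acme Doors Ltd (data within a tag)
print(root.find("fireRating").text)    # FD30
```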
IFC is a data model that can be expressed in different data formats: STEP (an ASCII geometry-and-attribute exchange format from the 1980s), ifcXML (currently the long-form version), and a variety of Model View Definitions delivered as STEP, XML, CSV or XLS/flat-file formats, the most widely known being COBie. A quick word about JSON (JavaScript Object Notation): this is a more compact data format which is becoming popular for large datasets such as geometry, because most of the content is just numerical coordinates.
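To see why JSON appeals for coordinate-heavy data, compare the same (made-up) three points serialised both ways; the XML carries markup around every number, while the JSON is mostly the numbers themselves:

```python
import json
import xml.etree.ElementTree as ET

# Three hypothetical points, serialised as XML and as JSON.
points = [[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 3.0, 0.0]]

root = ET.Element("points")
for x, y, z in points:
    p = ET.SubElement(root, "point")
    p.set("x", str(x))
    p.set("y", str(y))
    p.set("z", str(z))
xml_text = ET.tostring(root, encoding="unicode")

json_text = json.dumps(points)

# The JSON string is noticeably shorter than the XML for the same data,
# and the gap widens as the point count grows.
print(len(xml_text), len(json_text))
```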
A real world BIM example
Within the BIM world this is not new. BIMserver (http://bimserver.org), an open BIM platform for federating and sharing IFC models, is in fact a data service that uses a database and a web API, but you do have to either buy a 3rd-party GUI (graphical user interface) or, if you are a developer, build your own. It is open source, written in the Java programming language, so you do need to be a bit savvy to run the server, as it requires the correct version of the Java runtime. Being Java, it will run on just about any type of computer or operating system. (BIMserver uses the Berkeley DB backend, which is pretty fast, and the API communicates in either XML or JSON, your choice.)
At my company (activePLAN) we have a client briefing system called IMPACT, a web-based application for the client to manage the brief and for teams to report how they are going to comply and with what solutions. (We use a SQL Server relational database as our backend and XML in the data service.) Initially there were two ways to interact with the system: use the web interface, or export and import spreadsheets. However, in the design process the designers would have to work up their scheme while continually cross-referencing the relevant section of the brief.
In a simplistic view you could add all the space-requirement attributes to each space in the BIM CAD system and work out some way of importing this data. But if the brief changed, checking and updating the model would be a manual process.
Our solution was to take advantage of the APIs which all these BIM CAD systems have, so we created an IMPACT plugin for Revit. The connection information lives inside the model and actually exports to IFC. Each space can be mapped from the model to the brief, and again this connection exports to IFC. When the user clicks on a space, a docked window appears with all the briefing information, and the plugin even uses the data in the model to compare against the brief. Fixtures and fittings can be compared too, and we even have an overview of the model that thematically traffic-lights spaces that are not complying with the brief. All of this uses data services.
What about today?
So what can data services do for the BIM process and the whole request-for-information process?
Product Manufacturers could run product information data services based on standard and adapted product type templates. They would not be tied into expensive portals and services and could inexpensively reach a worldwide market.
Design, construction, installation and FM contractors, and client FM teams, could run their own product libraries: harvesting information from product manufacturers, receiving data from their teams and filling in the gaps. When product manufacturers update information such as operation procedures, simple information polling would let clients retrieve the latest information automatically.
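The polling idea is simple enough to sketch. Here both the manufacturer's service and the local library are simulated with dictionaries, and the record names and dates are all hypothetical; a real implementation would compare timestamps over HTTP:

```python
# Sketch of "information polling": periodically compare a remote last-updated
# stamp with the locally stored one, and fetch the record when it is newer.
# Both "services" are simulated with dicts; all names are hypothetical.

REMOTE = {"P-100": {"updated": "2014-03-02", "operation_procedure": "v2"}}
LOCAL  = {"P-100": {"updated": "2014-01-15", "operation_procedure": "v1"}}

def poll(product_id):
    """Refresh the local library from the remote service if it is stale.

    Returns True if a newer record was pulled down, False otherwise.
    """
    remote = REMOTE[product_id]
    if remote["updated"] > LOCAL[product_id]["updated"]:  # ISO dates compare as strings
        LOCAL[product_id] = dict(remote)                  # pull the latest record
        return True
    return False

print(poll("P-100"))  # True  -- the local copy was stale and has been refreshed
print(poll("P-100"))  # False -- already up to date
```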
The building model now becomes a virtual model, which a whole host of applications can access and edit from within the appropriate tool. The reality of how people work is that they cannot all use the same tool on a single model file. They need to work on their bit with the tool that does the job they need to do.
Future applications could mine the data that clients make accessible for life-cycle energy performance, which could then be used for informed selection.
Here is a typical scenario. The construction manager is looking at the building model in the site office and clicking on objects in the model to review what has been installed. The installer is on the 10th floor with their mobile phone, capturing the barcode on the same item. He clicks, and the manager sees that the barcode has been entered and the date and time it was set. The installer also takes a photo (just to prove the object is installed), and that photo is now accessible to the design manager. The client's Asset Information Requirement specifies that they be supplied with key manufacturer and model data. The installer has not provided this information, but by capturing the QR code he has given the application all it needs to get the manufacturer's product data: the QR code contains the product manufacturer's data service details and the product reference. This data is picked up by the application the installer is using, which initiates a call to check whether the installer's product library already has the information. If it does not, the application calls the product manufacturer's data service, gets a package of data, updates the installer's product library and pushes it into the project library.
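The QR-code part of that scenario can be sketched as a simple cache-then-fetch flow. The payload fields, service names and product data below are all invented for illustration, and the manufacturer's service is simulated with a dictionary:

```python
# Sketch of the QR-code flow: the code carries the manufacturer's data-service
# details and a product reference; the app checks the installer's own product
# library first and only calls the manufacturer's service on a miss.
# All names and data here are hypothetical.

qr_payload = {"service": "acme-service", "product_ref": "P-100"}

MANUFACTURER_SERVICES = {
    "acme-service": {"P-100": {"manufacturer": "Acme", "model": "FD30 Door"}},
}
installer_library = {}   # the installer's own product library
project_library = {}     # the shared project library

def capture(payload):
    """Resolve a scanned QR payload into product data, filling both libraries."""
    ref = payload["product_ref"]
    if ref not in installer_library:                   # miss: call the service
        service = MANUFACTURER_SERVICES[payload["service"]]
        installer_library[ref] = service[ref]          # update the installer library
    project_library[ref] = installer_library[ref]      # push into the project library
    return project_library[ref]

print(capture(qr_payload))  # {'manufacturer': 'Acme', 'model': 'FD30 Door'}
```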
When the COBie data is generated and passed to the client, the client's FM manager loads it into his COBie visualizer, clicks on the same object and sees the barcode data, who entered it, and the photograph of the object with the time and date it was taken. It may even have GPS coordinates, so he knows it was taken in his building.
In another example, the contractor is responsible for selecting a product. The design team have issued a product type template, a version issued as a standard. The contractor is looking at three different options, one of which is already in their company product library and two of which, being new to the market, are not. On the web page of each of the two other products is a BIM icon. This works a bit like Twitter or other social media icons, except that the BIM icon initiates a data service request. Clicking it pops up a window telling the contractor that they are currently signed into their own product library and that they can initiate a transfer of all or selected parts of the data into it. The contractor proceeds, and now all the product data is in their product library. As this manufacturer did not previously exist in the library, it now does, as does the product and any other associated data or files, including COSHH data, maintenance regimes, CAD models, photographs, installation instructions, etc. This product data set is based on the very same product template the design team are using. The contractor does the same for the other product. Within the product library he can compare all three product data sets, decide which one to use, and then add the selected product to the project data service, where the design team can review and approve it.
What's good about all this? If it is not obvious: who wants to sit all day tapping a keyboard, entering and importing data? That's why we have computers.
I believe that in the future all product manufacturers will be running their own, or subscribing to, product library data services using certified product type templates, which transmit data as XML in an IFC product data format. Common data environments will have to evolve to support data services, and BIM library support will become less relevant, freeing designers from limited kits of parts: they can create their own generic objects and connect them to data services, managed from within the application, and by others via a variety of devices on low-cost applications.