Attention aspiring structural engineers: I hope you paid attention in your freshman-year programming courses. On recent projects, I have found myself laboring over programming algorithms and stressing over how to manage large amounts of data. Some really cool advancements in modeling software enable structural engineers to utilize data that was previously discarded – but how do we manage all that new information? There are two emerging trends in particular that I believe will strongly influence the future of the industry. In the first half of this blog series I’ll describe a Forensic Information Model (FIM).
For some years now, BIM (Building Information Modeling) has been the buzzword du jour. The principle of the technology is that modeled items can contain attributes. Until recently, that additional tagged data was mostly just used to define geometry. In CAD (or hand drafting, for that matter) a drawn line is just pixels on the screen (or ink on the page), but a BIM element is a beam with a specific cross section, joined to connecting members, placed on a certain level within the building. This additional information makes the 3D computer modeling environment much more user friendly – cutting sections on plan now only requires a few mouse clicks. The process doesn’t necessarily make drafting any more efficient, though, as there is now a whole lot more data to be managed.
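To make that distinction concrete, here is a minimal sketch in Python – with invented class and field names, not any real BIM platform's API – of a "dumb" CAD line versus a BIM element that carries attributes:

```python
from dataclasses import dataclass, field

@dataclass
class CadLine:
    # A CAD line is just geometry: two endpoints, nothing more.
    start: tuple
    end: tuple

@dataclass
class BimBeam:
    # A BIM element carries the same geometry plus engineering attributes.
    start: tuple
    end: tuple
    section: str                                       # cross section, e.g. "W18x35"
    level: str                                         # building level the beam sits on
    connected_to: list = field(default_factory=list)   # IDs of joined members
    parameters: dict = field(default_factory=dict)     # arbitrary tagged data

beam = BimBeam(start=(0, 0), end=(30, 0), section="W18x35", level="Level 2")
beam.parameters["FireRating"] = "2 hr"   # any extra data can ride along on the element
print(beam.section)                      # -> W18x35
```

The `parameters` dictionary is the important part: once every element can hold arbitrary tagged data, the model becomes a container for far more than drawings.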
BIM software has caught on largely because of the opportunity it presents for 3D visualization and rendering. Some steel detailers also have the capacity to import the models into their connection modeling software, which expedites detailing and reduces some human error in plan reading. BIM software developers have long promised integration with analysis programs as well, but that data transfer has proven more difficult – in practice we only expect the basic geometry to port accurately.
This is where the programming geniuses come in. Experts in my company have developed their own programming tool that can extract or insert additional data in the building information model. They call it the parameter explorer. On a basic level, this tool can be used to populate a column schedule from a formatted spreadsheet. If you think outside the box, however, there is much more information you might want to tag in a model – e.g., URLs, photos, calculations, field data. Our Building Performance group has used this technology to develop a Forensic Information Model (FIM).
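As a rough illustration of that schedule-population idea – with a plain dictionary standing in for the model elements and invented field names, not the actual parameter explorer – the workflow might look like:

```python
import csv
import io

# Hypothetical stand-in for the model: element parameters keyed by column mark.
# (The real tool works against the BIM platform's API; these names are invented.)
model = {"C1": {}, "C2": {}}

# A "formatted spreadsheet" exported to CSV: one row per column mark.
schedule_csv = """Mark,Section,BasePlate,StudQty
C1,W10x49,PL1x12x12,4
C2,W12x65,PL1.25x14x14,4
"""

for row in csv.DictReader(io.StringIO(schedule_csv)):
    mark = row.pop("Mark")
    # Push every remaining cell into the matching element as a tagged parameter.
    model[mark].update(row)

print(model["C2"]["Section"])   # -> W12x65
```

Going the other direction – reading parameters back out of the model into a spreadsheet – is the same loop in reverse, which is what makes a tool like this so flexible.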
We can now create 3D structural models and tag forensic information to individual members – photos, inspection notes, shop drawings, etc. On large scale projects, the data is archived in a database program. Custom attributes are created in the 3D model associated with the database information. The parameter explorer provides the link between the model and the data. Another homemade viewer program can be worked up to provide a user interface friendly enough for even a lawyer to use.
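Here is a hedged sketch of that model-to-database link, using SQLite in place of Access and invented table and field names – the real system ties an Access database to custom model attributes through the parameter explorer:

```python
import sqlite3

# In-memory database standing in for the project's forensic archive.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE forensic_records (
    member_id TEXT,   -- key matching a custom attribute on the model element
    kind TEXT,        -- photo, inspection note, shop drawing, ...
    path TEXT,        -- where the file lives
    note TEXT)""")
db.executemany(
    "INSERT INTO forensic_records VALUES (?, ?, ?, ?)",
    [("BM-107", "photo", "photos/BM-107_fracture.jpg", "fracture at cope"),
     ("BM-107", "inspection", "notes/BM-107.pdf", "section loss at web")],
)

def records_for(member_id):
    # The viewer's job: the user clicks a member in the 3D model, and we
    # pull every record tagged with that member's database key.
    return db.execute(
        "SELECT kind, path, note FROM forensic_records WHERE member_id = ?",
        (member_id,)).fetchall()

print(len(records_for("BM-107")))   # -> 2
```

The only thing the model element itself needs to store is the key (`BM-107` here); everything else lives in the database, which keeps the model light even on large-scale projects.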
My boss recently presented our company’s use of FIM at the 2012 Structures Congress in Chicago. He described how the technique was applied to the Minneapolis I-35W bridge collapse investigation. FIM was also used to reconstruct the path of a falling dumpster that caused extensive damage to a high rise in New York City. The talk is summarized in a recent ENR article, Structural Engineers Learn Lessons from Failures through Virtual Databases.
The end result of a FIM model looks super cool, and the name just sounds really intelligent too. However, the programming behind the veneer is serious stuff, and managing all the data is a thankless job. Only a handful of people in the company really know how to set up the whole system from start to finish, though there are those of us who know enough about one part or another to be really dangerous. Those with knowledge of the modeling platforms (Revit, CATIA, etc.), the database software (Access), and the programming language (VBA) are very valuable to the team.
Emerging trends in digital product delivery require managing large amounts of data. Transferring model data between visualization and analytical software requires complex data manipulation. The companies and individuals that learn how to handle this data will be more efficient – and better able to offer objective evidence that their designs are superior.