March 21, 2008
BIM: Reaching Forward, Reaching Back

by Michael Tardif, Assoc. AIA
Contributing Editor

Summary: The ongoing dialogue about building information modeling (BIM), particularly among architects, tends to center on a small group of software applications. But as our understanding of BIM continually expands, a whole suite of specialized software applications is emerging that extends BIM technology forward—to project stages beyond design—and backward—to project stages before design. This is a good sign that the technology is evolving to support two aspirational goals of the building industry: integrated project delivery and a business culture that views each stage in the life of a building as an integral part of its entire life cycle. But as software offerings proliferate, so do the decisions that firm leaders have to make about which tools to purchase and implement. A few good benchmarks for choosing applications include how well they reduce cycle time, leverage the professional expertise of the firm, improve the quality of service and deliverables, and enhance team communication and integration.


What we now call “BIM authoring tools”—design applications such as Autodesk Revit; Bentley Architecture; and Allplan, ArchiCAD, and Vectorworks, all by Nemetschek—will likely remain at the core of any BIM-based design process. These tools are powerful and complex—attributes that are reflected in their cost, the skill required to master them, and the degree to which they alter existing CAD-based business and design processes.

Another distinct group of applications has emerged that can be loosely categorized as “audit and analysis tools.” These highly specialized tools analyze—rather than create—building information models, though some have the ability or potential ability to add to or increase the richness of information contained in a model. They typically do one thing very well, such as energy analysis (Ecotect), code compliance checking (Solibri Model Checker), or clash detection (Navisworks Jetstream).

Application to projects on the boards
A hallmark of many audit and analysis tools is that they allow designers to apply large repositories of statistical data to specific projects, a type of comparative analysis that, done by hand, remains prohibitively time-consuming and expensive for most projects. Some of these tools also let users add data from their own project experience or institutional knowledge to the statistical database, customizing the repository to reflect a firm’s specialized or proprietary knowledge. Because of their limited, specialized functionality, these tools are less costly than BIM authoring tools, easier to learn and implement, and less disruptive to a firm. Most important, they allow design firms to highlight their specialized expertise. Applying specialized professional knowledge about energy-efficient design, for example, is much faster and easier with an analysis tool such as Ecotect, particularly for comparative analysis of design scenarios in the early stages of design. An intuitive sense of an optimal, energy-efficient design solution can be confirmed, refined, or disproved by a more rigorous analysis grounded in scientific data.
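To make the idea of such comparative analysis concrete, the following minimal Python sketch ranks early design scenarios against an assumed statistical baseline of energy use intensity. It is not drawn from Ecotect or any other product, and every figure in it is invented for illustration.

    # Hypothetical comparative energy analysis: rank early design scenarios
    # against an assumed statistical baseline. All figures are illustrative.

    # Assumed baseline energy use intensity (kBtu/sq ft/yr) for an office
    # building -- a stand-in for a statistical repository of historical data.
    BASELINE_EUI = 90.0

    # Each scenario pairs a design option with a modeled EUI from early analysis.
    scenarios = {
        "Scheme A: deep floor plate, punched windows": 82.0,
        "Scheme B: narrow floor plate, daylighting": 68.0,
        "Scheme C: narrow floor plate, daylighting + shading": 61.0,
    }

    def compare(scenarios, baseline):
        """Return scenarios sorted by modeled EUI, with savings vs. baseline."""
        results = []
        for name, eui in sorted(scenarios.items(), key=lambda kv: kv[1]):
            savings = (baseline - eui) / baseline * 100.0
            results.append((name, eui, savings))
        return results

    for name, eui, savings in compare(scenarios, BASELINE_EUI):
        print(f"{name}: {eui:.0f} kBtu/sf/yr ({savings:+.0f}% vs. baseline)")

The point of the sketch is simply that an intuitive ranking of design options can be checked against a benchmark drawn from historical data, which is what these tools automate at far greater depth.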

As a business strategy, recouping a firm’s total investment in BIM may well depend on choosing and deploying the right set of audit and analysis tools to leverage fully the value of the information firms create with the more expensive authoring tools. It is much easier to calculate the return on investment for audit and analysis tools than for authoring tools, because their benefits can be brought to bear on projects almost immediately, even if the information they generate is shared only internally. This aspect should not be overlooked, particularly as it pertains to the potential risks of integrated project delivery. Firms can use the data these applications generate—profitably—to improve the quality and reduce the cycle time of their own design analyses before they begin sharing the data with others. As their understanding of the nature, quality, and integrity of the data increases over time, they will be in a better position to set reasonable expectations with team members and clients about what the information means and how to use it appropriately.

Building information models need to be developed to a certain level of detail to leverage the full power of audit and analysis tools. While most of these tools can be used for comparative analysis of alternative designs in the conceptual stages of a project, generally speaking, the more detailed the model, the more accurately the analysis will predict actual building performance. Still, the comparative analysis of schematic models can be extremely useful for early decision making, provided, again, that everyone understands the nature of the information being generated and relies on it in an appropriate way.

BIM in planning and programming
A third group of applications extends BIM back to the planning and programming phases. The Onuma Planning System (OPS) is the most visible offering in this category, but the field is ripe for development, and others can be expected to emerge. In addition to providing the industry with the first database-driven information-synthesis tool built on a Web-services platform, OPS blurs the boundaries between the planning, programming, and conceptual design phases of projects by allowing project teams to engage in rapid prototyping. Significantly, as well, the mere existence of OPS has spurred other software developers to improve the information-exchange capabilities of their applications, further advancing the industry’s aspirational goals.

Smack in the middle of the conceptual design stage of projects, yet another category of BIM tools is emerging to bridge the huge gap between visualization and modeling tools. A distinguishing characteristic of these applications is that they generate information for design decision making by referencing statistical data circumscribed by a set of project-specific design assumptions, rather than by rigorous analysis of project-specific data in a fully developed project model. They differ from audit and analysis tools in two subtle but important ways: they draw a clearer distinction between the underlying statistical data and the actual project data used for analysis, and they allow users to document early, “high-level” designs and assumptions more explicitly. These attributes help reduce the potential for misunderstanding among project team members and clients about what the resulting analysis means.
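A minimal, hypothetical sketch of that distinction might tag every input as either a statistical default or project-specific data, so any analysis carries a record of its own basis. Nothing here reflects the data model of an actual product; the names and values are invented.

    # Hypothetical sketch: keep statistical defaults and project-specific data
    # distinct, so an early analysis documents its own assumptions.
    from dataclasses import dataclass

    @dataclass
    class Input:
        name: str
        value: float
        units: str
        source: str  # "statistical default" or "project-specific"

    inputs = [
        Input("gross area", 120_000, "sq ft", "project-specific"),
        Input("floor-to-floor height", 13.5, "ft", "statistical default"),
        Input("exterior wall cost", 38.0, "$/sq ft", "statistical default"),
    ]

    def report(inputs):
        """List which values are assumptions and which are actual project data."""
        for i in inputs:
            print(f"{i.name}: {i.value} {i.units} ({i.source})")

    report(inputs)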

Conceptual design tools are not full-featured authoring tools, per se, but they do generate conceptual BIM models that ostensibly can be imported into a BIM authoring tool at a later stage. The benefit is that they provide access to reliable data that would be difficult to generate in the conceptual stages of projects using conventional BIM authoring tools. DProfiler, a conceptual design and cost estimating tool, is in this category. Originally developed for internal use by the Beck Group—a Texas-based design and construction firm—for its own projects, DProfiler is now available, after 15 years of development, as a commercial product through the firm’s Beck Technology division. While about half of the firm’s first 150 customers are contractors, the next largest customer segment, at 25 percent, consists of architects. The remaining 25 percent of customers are in the owner group: developers, educational institutions, and other end users.

Users can begin modeling directly in DProfiler, or start by importing information from a visualization tool such as SketchUp and building the DProfiler model as an overlay on top of it. Conceptual models created in DProfiler can subsequently be exported to BIM authoring and other tools through the buildingSMART Industry Foundation Classes (IFC) format. According to Andy O’Nan of Beck Technology, conceptual modeling is faster and more automated in DProfiler than in conventional BIM authoring tools. As the conceptual model is developed, a real-time cost estimate is tallied on the basis of the integral RS Means cost database, which is regionalized for more than 1,000 U.S. cities. As the model becomes increasingly refined, the project team can override the underlying statistical data or assumptions with actual project-specific data.
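A minimal sketch of that kind of workflow might look like the following. The unit costs and the regional index are invented, not taken from RS Means, and the override mechanism is only an illustration of replacing statistical defaults with project-specific data as it becomes available.

    # Hypothetical running cost estimate for a conceptual model. Unit costs and
    # the regional index are invented; they do not come from RS Means.

    NATIONAL_UNIT_COSTS = {      # $/sq ft, statistical defaults
        "structure": 22.0,
        "cladding": 38.0,
        "mechanical": 30.0,
    }
    REGIONAL_INDEX = 1.08        # assumed multiplier for the project's city

    # Quantities grow as the conceptual model is developed (sq ft per system).
    quantities = {"structure": 120_000, "cladding": 45_000, "mechanical": 120_000}

    # Project-specific overrides replace statistical defaults as real data arrives.
    overrides = {"cladding": 44.0}   # e.g., an actual quoted cladding price

    def running_estimate(quantities, overrides):
        """Tally the estimate, preferring project data over regionalized defaults."""
        total = 0.0
        for system, qty in quantities.items():
            unit = overrides.get(system, NATIONAL_UNIT_COSTS[system] * REGIONAL_INDEX)
            total += qty * unit
        return total

    print(f"Conceptual estimate: ${running_estimate(quantities, overrides):,.0f}")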

Minimize the unknowns in risk management
As O’Nan points out, conventional “value engineering” takes place at the very late stages of projects—usually the bid phase—and is little more than a cost-cutting exercise. Because it is too late to change many fundamental decisions about a project, the first thing to go is the architecture—finishes, details, even major aesthetic features—regardless of the original commitment of the owner to design excellence. DProfiler allows teams to comparatively analyze the fundamentals, such as structural systems, mechanical systems, and cladding systems, so that their selection can be optimized and their relative value assessed on a level playing field with the architecture. As the product develops, more resources are being added to enable life-cycle and energy consumption analysis, and the product is increasingly integrating with related tools such as Sage Timberline Office. Architects armed with such a tool will be in a better position to provide their clients with clearer choices about costs and benefits of alternative designs, and can increase their own institutional knowledge and expertise with respect to construction cost, even if, initially, that knowledge is used only internally.
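To suggest what comparing fundamentals on a level playing field might look like, the following hypothetical sketch sets mechanical-system alternatives side by side on first cost and a rough, undiscounted operating cost over an assumed study period. All of the figures are invented and stand in for the kind of data such a tool would supply.

    # Hypothetical comparison of mechanical-system alternatives on first cost
    # plus rough operating cost over an assumed study period. All figures are
    # invented for illustration.

    STUDY_YEARS = 20

    alternatives = {
        # name: (first cost $, assumed annual energy cost $)
        "Packaged rooftop units": (2_400_000, 310_000),
        "VAV with central plant": (3_100_000, 240_000),
        "Ground-source heat pumps": (3_900_000, 180_000),
    }

    def life_cycle_cost(first_cost, annual_cost, years=STUDY_YEARS):
        """Simple, undiscounted life-cycle cost for comparative purposes."""
        return first_cost + annual_cost * years

    for name, (first, annual) in alternatives.items():
        print(f"{name}: first ${first:,}, {STUDY_YEARS}-yr total "
              f"${life_cycle_cost(first, annual):,}")

Even a comparison this crude makes the trade-off between first cost and operating cost explicit early, which is precisely when fundamental system decisions are still open.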

Liability risks for design professionals can be loosely grouped under two headings: duty risks and unknown risks. Duty risks are the risks that design professionals knowingly assume by virtue of their professional license and the binding agreements they sign with clients. These are risks that, to a large degree, can be quantified by actuarial analysis. Professional liability insurers, then, can put a price on that risk. Unknown risks stem from, well, the unknown. Integrated Project Delivery, seamless data exchange, conceptual cost-estimating—all of these activities present potential risks to design professionals that cannot be quantified because there is so little prior experience on which to assess the risks. One way to minimize the unknown risks—and keep them from turning into future duty risks—is for all project team members to have a clear understanding of the information generated at every stage of a project and its intended purpose. In this, architects can play a key role in advancing the aspirational goals of the industry while enhancing their position in the industry at the same time.

Copyright 2008 Michael Tardif. Reprinted with permission.

 

Michael Tardif, Assoc. AIA, CSI, Hon. SDA is a writer and editor in Bethesda, Md., and the former director of the AIA Center for Technology and Practice Management.

The statements expressed in this article reflect the author’s own views and do not necessarily reflect the views or positions of the American Institute of Architects. Publication of this article should not be deemed or construed to constitute AIA approval, sponsorship, or endorsement of any method, product, service, enterprise, or organization.