New Era of BIM Lifecycle Implementation – Part 4

Part 4: A Cradle to Cradle Digital Twin Ecosystem for Building Asset Management

Continued from Parts 1, 2, and 3

By Dr. Eve Lin, Dr. Xifan (Jeff) Chen, and George Broadbent

Introduction

Previously in this series, we reviewed the importance of data management behind the model handoff during BIM lifecycle implementation, in terms of data interoperability, accuracy, and sufficiency. We highlighted the necessity of developing well-defined data requirements from an Asset Management / Facility Management (AM/FM) perspective and using them to regulate data collection and population during the delivery phase. We also described an AM/FM Data Dictionary Management System (DDMS) that helps address commonly seen issues that occur during the Project Information Model (PIM) to Asset Information Model (AIM) transition, as well as during operations and maintenance. This article discusses the data flow across the entire project lifecycle, from delivery to operations, and introduces the current trend of Digital Twins: an ideal BIM implementation scenario that must be built on a solid data foundation.

BIM Lifecycle Data Management

As discussed in a previous article, Figure 1 illustrates the top-down structure of a project lifecycle at different levels and shows how fundamental a well-defined and managed FM-oriented DDMS is to the entire BIM program. While we emphasized the importance of a DDMS during the FM stage because of that stage's long duration and high operational cost, a well-planned DDMS is a critical foundation for BIM lifecycle data management as a whole. In the real world, even a well-coordinated data management plan can collapse at any moment because of small glitches in the data exchange process that ripple through the downstream data flow. Without governance and policies to standardize workflows and processes, it is hard for organizations to maintain data interoperability, sufficiency, and accuracy, because teams and individuals tend to work in silos and make decisions based only on the information available to them.

Therefore, we need to elevate the topic from the data or technology application level to the business process level, which includes but is not limited to the BIM Standard, BIM Execution Plan, Contractual Language, and Quality Assurance/Quality Control (QA/QC) Compliance Validation. This is where a DDMS can help bridge the data gaps. In the following sections, we look closely at one such gap through real-world scenarios and describe how a DDMS can benefit the project delivery and operational loop.

Figure 1: BIM Lifecycle Implementation needs to be regulated by a proper business process

Data Gaps

Data gaps are the most common problem in BIM lifecycle implementation, yet they remain difficult to resolve. A data management plan can break down over just one or two data gaps that occur during project delivery and operations. One example is miscoordination between validation of the submitted BIM model and onsite data collection. In an ideal scenario, data from these two channels complement each other in a synchronized way; in practice, this rarely occurs as planned. Figure 2 illustrates the ideal and real scenarios of asset data collection for facility management purposes. During the project delivery phase, data is collected from all channels (onsite, cut sheets, manuals), yet some of it is not required by facility management, while the data facility management actually needs is missed. Possible root causes are listed below:

  • There is no concrete FM-oriented data requirement.
  • If there is a solid data requirement, it is not included in the Integrated Project Delivery (IPD).
  • If the requirement is included in the IPD, there is no QA/QC session to regulate the data submission.

Figure 2: Ideal scenario vs. real scenario

Closing the Loop

In this case, the DDMS starts by helping the owner or facility management build a well-defined data requirement (with the semantic analysis and Artificial Intelligence (AI) recommendation functionalities described in the previous article). In this way, the facility owner systematically understands what data to expect from upstream. After the data dictionary has been developed, the DDMS further polishes the data requirement, assembles the data dictionary into data requirement (parameter) packages, and shares them with each corresponding stage (design, construction, commissioning, etc.). Stakeholders in each delivery stage are informed of the types of data they are expected to collect or populate, avoiding the expensive task of collecting and coordinating asset data after the building has been occupied. Lastly, the DDMS needs to be capable of analyzing submitted data against the hosted data dictionary to perform a compliance check. This process tests not only data completeness but also data validity and uniqueness against each attribute/parameter requirement. The DDMS also performs analytics and reporting on the analyzed data so the data or BIM manager can refine their data collection.
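To make this concrete, below is a minimal sketch of what such a dictionary-driven compliance check could look like. The dictionary layout, parameter names, and rules here are illustrative assumptions, not an actual DDMS schema; a hosted dictionary would be far richer, but the shape of the check (completeness, validity, and uniqueness per parameter) stays the same.

```python
# Minimal sketch of a dictionary-driven compliance check (illustrative only).
# The dictionary layout and rules are assumptions, not an actual DDMS schema.

from typing import Any

# A hypothetical FM data dictionary: each parameter defines its own rules.
DATA_DICTIONARY = {
    "AssetTag":      {"required": True,  "type": str, "unique": True},
    "Manufacturer":  {"required": True,  "type": str, "unique": False},
    "InstallDate":   {"required": True,  "type": str, "unique": False},
    "WarrantyYears": {"required": False, "type": int, "unique": False},
}

def check_compliance(records: list[dict[str, Any]]) -> list[str]:
    """Test completeness, validity, and uniqueness of submitted asset data."""
    issues = []
    seen: dict[str, set] = {p: set() for p, r in DATA_DICTIONARY.items() if r["unique"]}
    for i, record in enumerate(records):
        for param, rule in DATA_DICTIONARY.items():
            value = record.get(param)
            if value is None:
                if rule["required"]:
                    issues.append(f"record {i}: missing required parameter '{param}'")
                continue
            if not isinstance(value, rule["type"]):
                issues.append(f"record {i}: '{param}' should be {rule['type'].__name__}")
            if rule["unique"]:
                if value in seen[param]:
                    issues.append(f"record {i}: duplicate value for unique parameter '{param}'")
                seen[param].add(value)
    return issues

# Example: one record is missing a manufacturer and reuses an asset tag.
submitted = [
    {"AssetTag": "AHU-01", "Manufacturer": "Acme", "InstallDate": "2023-05-01"},
    {"AssetTag": "AHU-01", "InstallDate": "2023-06-12"},
]
for issue in check_compliance(submitted):
    print(issue)
```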

In short, the DDMS helps clients determine what needs to be collected, as well as when and by whom. The entire process is centered on the defined data dictionary to eliminate data miscoordination. Figure 3 illustrates the loop of data collection and validation driven by the DDMS.

Gaps like the one described above vary in form and occur throughout project delivery. The aggregation of data glitches here and there can add up to the failure of the entire data management plan, regardless of how good it was originally. The combination of a well-defined data dictionary, a solid business process that ensures the successful implementation of that dictionary, and effective data management tools can help clients overcome these gaps and eventually get on the right track to close the loop.

The Call of Digital Twins

The Phrase Speaks for Itself

The concept of a Digital Twin has been around since 2002 and has quickly spread into manufacturing, healthcare, automotive, aerospace, and other industries. The basic definition of a Digital Twin is a digital replica of a physical entity that connects the physical and virtual worlds to enable data synchronization between them. In other words, the Digital Twin is a living model that reflects and keeps up with its physical counterpart, including not only the representative geometry but also current conditions and data.

In the Architecture, Engineering, Construction, Owner/Operator (AECO) industry, a Digital Twin is not tied to a specific level of development, detailed modeling requirement, or particular technology or process. From the owner’s view, what they receive at the end of the delivery phase is a digital replica of what has been built onsite, which, most of the time, includes everything the owner wants to know.

Once a project has gone through the delivery phase, onsite commissioning, and field/asset data collection, population, and validation, a Digital Twin can be developed and handed over to the owners. The Digital Twin fully connects and syncs with the owner’s computerized maintenance management system (CMMS), updating itself periodically during the entire service life of the facility. Facility asset information, from detailed to high level, can be synchronized into the Digital Twin, and analytics and reporting can be performed from there. Meanwhile, the 3D representation of the facility helps operation managers and line workers accelerate work order response time, enhance data interoperability, and increase FM efficiency and efficacy. Being a living digital replica, the Digital Twin keeps itself updated during the facility’s lifespan and is ready to serve as a refreshed “as-built model” for the next renovation. In this way, a Digital Twin reshapes the “Cradle to Grave” expectation into a “Cradle to Cradle” BIM lifecycle.
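As a rough illustration of the periodic twin-to-CMMS synchronization described above, the sketch below polls a twin endpoint and upserts asset records into a CMMS on a fixed interval. The endpoint URLs, field names, and interval are hypothetical assumptions; a production integration would use the specific CMMS vendor's actual API with proper authentication and error handling.

```python
# Minimal sketch of a periodic Digital Twin -> CMMS sync (illustrative only).
# Endpoint URLs, field names, and the sync interval are assumptions; a real
# integration would use the specific CMMS vendor's API.

import time
import requests

TWIN_URL = "https://twin.example.com/api/assets"  # hypothetical twin endpoint
CMMS_URL = "https://cmms.example.com/api/assets"  # hypothetical CMMS endpoint
SYNC_INTERVAL_SECONDS = 3600  # "periodically" -- here, once an hour

def sync_once() -> None:
    """Pull current asset state from the twin and push updates to the CMMS."""
    assets = requests.get(TWIN_URL, timeout=30).json()
    for asset in assets:
        # Upsert each asset record, keyed by its asset tag.
        requests.put(f"{CMMS_URL}/{asset['asset_tag']}", json=asset, timeout=30)

if __name__ == "__main__":
    while True:
        sync_once()
        time.sleep(SYNC_INTERVAL_SECONDS)
```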

Figure 3: A Sample loop of data collection and validation

Digital Twin Ecosystem

The living mapping between the virtual and the physical world enables analyses and reporting of real-world data to head off problems before they occur, prevent failures, and potentially develop new plans. The concept of a Digital Twin naturally and logically connects BIM, CMMS, Business Intelligence (BI), Artificial Intelligence (including machine learning, deep learning, etc.), data science, GIS, and the Internet of Things (IoT).

“Living” is the keyword in this connected process, enabling capabilities that a traditional “static” as-built model cannot offer. An effective and functioning Digital Twin for facility management requires a well-balanced five-dimensional ecosystem, as illustrated in Figure 4, comprising the Physical World, the Digital World, People & Organization, Business Intelligence, and Digital Connections. How effective and intelligent a Digital Twin will be depends on how well each dimension is developed and connected.

Physical World: Data Capture Capability

How faithfully a Digital Twin replicates the physical world depends on how well it can capture real-time data. Advances in sensing, monitoring, and measuring technology, along with IoT, allow an abundance of real-time data to be collected and utilized. Evolving from traditional monthly metered utility bills, real-time outdoor and indoor environments and building systems can now be captured, measured, and monitored for operational and maintenance opportunities. However, the effectiveness of the response path relies on the business intelligence of the Digital Twin, described in the fourth dimension of the ecosystem. In addition, the quality of the collected data also impacts the results. To prevent garbage in, garbage out, accurate and validated data must be obtained to ensure data integrity across its many uses.
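One hedged example of guarding against garbage in, garbage out at the point of capture: the sketch below applies simple plausibility and freshness checks to incoming sensor readings before they are accepted into the twin. The reading fields, value ranges, and staleness limit are assumptions made for illustration.

```python
# Minimal sketch of validating IoT readings at ingest (illustrative only).
# Field names and plausibility ranges are assumptions for this example.

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Reading:
    sensor_id: str
    kind: str          # e.g. "indoor_temp_c", "co2_ppm"
    value: float
    timestamp: datetime

# Plausibility ranges per reading kind (assumed values).
PLAUSIBLE_RANGES = {
    "indoor_temp_c": (-10.0, 50.0),
    "co2_ppm": (300.0, 5000.0),
}
MAX_AGE = timedelta(minutes=15)  # reject stale data

def is_valid(reading: Reading) -> bool:
    """Accept a reading only if it is fresh and within a plausible range."""
    low, high = PLAUSIBLE_RANGES.get(reading.kind, (float("-inf"), float("inf")))
    fresh = datetime.now(timezone.utc) - reading.timestamp <= MAX_AGE
    return fresh and low <= reading.value <= high

# Example: a fresh, in-range temperature reading passes the check.
r = Reading("S-101", "indoor_temp_c", 21.5, datetime.now(timezone.utc))
print(is_valid(r))  # True
```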

Digital World: Digital Model Maturity

The second dimension of the Digital Twin ecosystem relates to the maturity of the digital model. With the adoption of ISO 19650 and an FM-oriented DDMS as a backbone, a streamlined process from PIM to AIM becomes possible. While several BIM performance measurement metrics, maturity models, and tools have been developed to gauge the performance of BIM, measurement metrics for an FM-oriented Digital Twin are still not clearly defined. However, the maturity of the data model can be roughly gauged from its data quality (i.e., data accuracy, richness, and consistency), data capability, and lifecycle support.
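Since no formal metric exists yet, the following sketch shows one way such a rough gauge could be expressed: a weighted average over the dimensions named above. The weights and the 0-5 scale are our illustrative assumptions, not an established maturity model.

```python
# Rough sketch of gauging digital model maturity (illustrative only).
# The dimensions follow the text; the weights and 0-5 scale are assumptions.

MATURITY_WEIGHTS = {
    "data_accuracy": 0.25,
    "data_richness": 0.20,
    "data_consistency": 0.20,
    "data_capability": 0.20,
    "lifecycle_support": 0.15,
}

def maturity_score(scores: dict[str, float]) -> float:
    """Weighted average of per-dimension scores on an assumed 0-5 scale."""
    return sum(MATURITY_WEIGHTS[dim] * scores[dim] for dim in MATURITY_WEIGHTS)

# Example assessment of a hypothetical asset information model.
print(maturity_score({
    "data_accuracy": 4.0,
    "data_richness": 3.0,
    "data_consistency": 3.5,
    "data_capability": 2.5,
    "lifecycle_support": 2.0,
}))  # ~3.1 of 5
```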

People & Organization

How people and an organization interact with and utilize a system is the determining factor in the effectiveness of a Digital Twin. No matter how advanced the technologies are or how rich the data model is, a self-driving car won’t start until the driver knows where the power button is and pushes it. The people & organization aspect is an often-overlooked area. On the individual level, how well is each person equipped to interact with the systems they need? How can you increase an individual’s competency? On the organizational level, how efficient are the processes within the organization? How do you move the organization from a decentralized operation to a centralized and agile process? Sometimes matching users with tools they know how to use is more effective than giving them a supercomputer they don’t know how to run.

Business Intelligence

With the solid foundation of collected data, data models, and clearly defined business logic, this dimension focuses on making the system intelligent, for example by applying AI, machine learning (ML), and artificial neural networks to replicate human cognition and learning behavior to perform tasks and improve performance. With the foundation of real-time data and asset information, big data and advanced predictive analysis can turn unorganized data into actionable insights that increase the accuracy of predictions and support decision making. The services involved include, but are not limited to, condition monitoring, function simulation, evolution simulation, dynamic scheduling, predictive maintenance, quality control, etc.
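As a small, hedged example of the condition-monitoring service mentioned above, the sketch below flags readings that drift outside a statistical band derived from recent history. The window size, sigma threshold, and sample values are assumptions for illustration; production systems would use far more sophisticated models.

```python
# Minimal sketch of condition monitoring via a rolling statistical baseline
# (illustrative only; window size and threshold are assumptions).

from collections import deque
from statistics import mean, stdev

class ConditionMonitor:
    """Flags readings beyond an assumed 3-sigma band of recent history."""

    def __init__(self, window: int = 50, sigmas: float = 3.0):
        self.history: deque = deque(maxlen=window)
        self.sigmas = sigmas

    def observe(self, value: float) -> bool:
        """Return True if the reading looks anomalous against recent history."""
        anomalous = False
        if len(self.history) >= 10:  # need a minimal baseline first
            mu, sd = mean(self.history), stdev(self.history)
            anomalous = sd > 0 and abs(value - mu) > self.sigmas * sd
        self.history.append(value)
        return anomalous

# Example: a vibration spike on a pump stands out from its normal band.
monitor = ConditionMonitor()
for v in [1.0, 1.1, 0.9, 1.05, 0.95] * 4 + [4.2]:
    if monitor.observe(v):
        print(f"anomaly: {v}")  # prints for 4.2
```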

Digital Connections

The final dimension comprises the six digital connections that bring together the previous four areas: the relationships between the (1) physical model and digital model; (2) physical model and business intelligence; (3) digital model and business intelligence; (4) physical model and people & organization; (5) digital model and people & organization; and (6) business intelligence and people & organization. Moving data transfer from a manual, error-prone process to a streamlined and highly autonomous one is another key to the effectiveness of a Digital Twin. Numerous technologies are currently available to support these connections, including internet, communication, interaction, and collaboration technologies, as well as human-computer interaction technologies such as Virtual Reality, Augmented Reality, and Mixed Reality.

The connection between the physical and digital worlds ensures that collected real-time data are reflected dynamically. Through the connection between the digital model and business intelligence, the data can be analyzed and corresponding actions generated; then, through the connection between business intelligence and the physical world, the corresponding controls can be sent back and executed on the physical entity. The connection between people & organization and business intelligence allows the work orders and preventive maintenance generated in the business intelligence system to be sent back and deployed to the appropriate actors within the organization, dynamically reflecting its business logic, processes, and services.

Given the many different models and inputs across dimensions, a standardized data exchange protocol with unified communication interfaces and standards is more important than ever for ensuring smooth data interaction. The need for a standardized data exchange format re-emphasizes the importance of an FM-oriented DDMS. With it as the backbone of the data management system and the foundation of the connection protocol, well-connected pathways can be established to support a healthy Digital Twin ecosystem that manages the entire project lifecycle seamlessly.
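To illustrate what a standardized exchange payload anchored in the data dictionary could look like, the sketch below defines a simple record whose field names mirror dictionary parameters and serializes it to a common JSON wire format. The schema and classification code shown are assumptions, not a published standard.

```python
# Minimal sketch of a standardized exchange payload whose field names come
# from the shared data dictionary (illustrative only; the schema is assumed).

import json
from dataclasses import dataclass, asdict

@dataclass
class AssetExchangeRecord:
    # Parameter names mirror the DDMS data dictionary so every system
    # (BIM, CMMS, BI, IoT platform) reads and writes the same vocabulary.
    asset_tag: str
    classification: str      # e.g., an assumed classification code
    location: str
    attributes: dict         # dictionary-defined parameters and values

def to_wire(record: AssetExchangeRecord) -> str:
    """Serialize a record to the common JSON wire format."""
    return json.dumps(asdict(record))

# Example: the same payload shape moves between delivery and operations tools.
print(to_wire(AssetExchangeRecord(
    asset_tag="AHU-01",
    classification="D3040",
    location="Level 2 / Mechanical Room",
    attributes={"Manufacturer": "Acme", "InstallDate": "2023-05-01"},
)))
```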

Solid Foundation + Well-Balanced Development is the Key

The progression of technologies across the different dimensions of a Digital Twin brings tremendous potential, but that potential cannot be realized without a well-defined DDMS as the foundation and a well-balanced ecosystem. From the data foundation perspective, a well-functioning Digital Twin involves an ever-increasing number of data streams and growing complexity in maintaining the consistency, integrity, and interoperability of data collected from all sources, including, but not limited to, the roll-in of new data formats and roll-out of old ones, data from upstream BIM and as-built models, and collected IoT data.

Figure 4: Digital twin ecosystem

Consequently, establishing a solid DDMS as the foundation is a must for the success of a BIM lifecycle implementation or a Digital Twin system. From the technology implementation perspective, well-balanced development across all dimensions is essential. For example, IoT technologies enable a connected operation process and reshape data utilization for facilities and asset management. However, implementation is not a simple plug-and-play scenario free of challenges.

First, a standardized DDMS needs to be established to ensure a seamless data exchange flow. Then, all stakeholders involved in the process must work collaboratively and understand the methodology in order to utilize the innovative solution effectively. This means organizational and user capabilities need to grow simultaneously to keep the ecosystem in balance, which requires coordination between facility operations, IT, and business leaders. The learning curve of each applied technology determines how fully it is implemented and, in turn, affects overall system performance. Lastly, from the financial perspective, a well-established Digital Twin can require high capital costs for system implementation, including DDMS deployment, installation and configuration of various sensing and monitoring technologies, as-built model and data collection, business logic programming, and user training.

Despite that, lifecycle benefit/cost analyses of Digital Twins demonstrate a promising Benefit/Cost Ratio (BCR), return on investment (ROI), and payback period. However, the upfront expense is still a determining factor that impedes clients from adopting or even evaluating the system at the outset. Moreover, associated policy modifications, execution plans, adaptability, and other external influences indirectly impact system implementation, positively or negatively. Nonetheless, a Digital Twin as the final objective of BIM lifecycle implementation has become a widely adopted vision and goal for ultimately leveraging available technologies to their fullest potential for building lifecycle management.


Dr. Eve Lin is an EAM Strategy Consultant and Sustainability Lead at Microdesk. She specializes in providing strategic and technical solutions for clients to facilitate sustainable practices throughout the project lifecycle. Her work includes building performance simulation, design automation, BIM and GIS integration, and development of digital twin solutions.

Dr. Xifan Jeff Chen is the EAM Assistant Director at Microdesk and head of the EAM Strategic Advisory Service. Jeff specializes in providing strategic consulting services for clients, conducting and implementing BIM, EAM, and GIS integrated solutions, and developing digital twin methodologies for lifecycle BIM implementation.

George Broadbent is Microdesk’s Vice President of Asset Management and has worked on a variety of projects including the rollout of Microdesk’s Maximo and Revit integration solution, ModelStream. George works closely with key stakeholders to identify strategies for asset management projects and manages the effort to build out new systems.