By Curt Lefebvre, Founding Director and CEO, nDimensional
The promise of digital twins is huge, but implementing them at scale is hard. Asset-intensive enterprises have thousands of physical assets and business processes. Each has many data sources and digital twin use cases. Each twin consists of multiple models, which require frequent updates. And discrete twins must be integrated with each other and with other systems and applications to realize their full potential. This takes an industrial-strength digital twin platform.
Some vendors offer digital twin enabling technologies – e.g. for asset modeling, data management or IoT integration. Others offer portfolios of digital twins that pair with OEM-specific hardware assets. But achieving transformational value requires bringing all the pieces together into an integrated experience that goes from insight to action and scales enterprise-wide. The following are some of the key elements required.
Digital Twin Platform Requirements
- Big Data Management: A digital twin platform must rapidly, reliably and securely process data at huge volumes and velocities. It must structure messy streaming data so it always arrives in the right order, and apply an event-driven architecture that triggers actions at exactly the right times.
- Full Lifecycle Management: A digital twin platform requires standardized workflows for building, deploying, operating and maintaining huge numbers of digital twins and their associated digital masters and models. It must seamlessly move from prototype to production application. And twins must be constantly updated to reflect reality and drive outcomes based on continuous intelligence.
- System-Wide Integration: A platform is required for discrete twins to be seamlessly integrated into composite or organizational twins. Equally important is the need to integrate with other systems to drive action – such as using open APIs and advanced streaming technologies to connect twins with enterprise systems like ERP.
- Hyperautomation: A digital twin platform must incorporate hyperautomation in order to deploy twins at scale, reduce complexity and accelerate value. This requires standardized twin templates and workflows, expert knowledge codification and the ability to automatically learn and evolve using AI. Examples include auto-instantiating new twins when a data match is found and automatically structuring, visualizing and applying machine learning to twin data based on pre-defined standards.
- AI-Powered Applications: AI enables digital twin value to extend beyond situational awareness to also predict what’s next, prescribe solutions and take optimized actions in real time.
- Transparency: Digital twins by definition provide greater operational and business transparency. Digital twin platforms must go a step further and provide transparency across the full digital twin lifecycle – making things like model management and twin performance against goals easy to access, understand and share.
- Visualization and Analysis: Related to transparency, digital twin platforms should make it easy to visualize and interact with digital twin data. Users from different perspectives should be able to view relevant results as high-level dashboards or deep drill downs.
- Democratized Access: Digital twins will never reach their full potential if their creation and application rely solely on computer and data scientists. A scalable platform puts the power of AI digital twins in the hands of domain experts who know how to extract business value. It makes it easy for domain experts, business analysts and data scientists to collaborate in a shared environment. Examples of democratization include code-free twin development and AI model wizards.
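To make the hyperautomation idea above concrete, here is a minimal sketch of event-driven twin auto-instantiation: when a streaming record matches a registered template, a new twin is spun up automatically and kept up to date from then on. All names here (`TwinTemplate`, `TwinRegistry`, `on_event`) are hypothetical illustrations, not APIs of any particular platform.

```python
from dataclasses import dataclass, field

@dataclass
class TwinTemplate:
    """Hypothetical standardized twin template for an asset type."""
    asset_type: str
    fields: tuple  # the pre-defined fields this twin structures data into

@dataclass
class DigitalTwin:
    asset_id: str
    template: TwinTemplate
    state: dict = field(default_factory=dict)

    def update(self, record: dict) -> None:
        # Keep only the fields the template standardizes on.
        self.state.update({k: v for k, v in record.items()
                           if k in self.template.fields})

class TwinRegistry:
    """Event-driven registry: auto-instantiates twins on first data match."""
    def __init__(self):
        self.templates = {}  # asset_type -> TwinTemplate
        self.twins = {}      # asset_id -> DigitalTwin

    def register_template(self, template: TwinTemplate) -> None:
        self.templates[template.asset_type] = template

    def on_event(self, record: dict):
        asset_id = record.get("asset_id")
        asset_type = record.get("asset_type")
        if asset_id is None or asset_type not in self.templates:
            return None  # no matching template; ignore the event
        if asset_id not in self.twins:
            # Auto-instantiation: a data match triggers a new twin.
            self.twins[asset_id] = DigitalTwin(asset_id,
                                               self.templates[asset_type])
        twin = self.twins[asset_id]
        twin.update(record)
        return twin

registry = TwinRegistry()
registry.register_template(TwinTemplate("pump", ("flow_rate", "temperature")))
twin = registry.on_event({"asset_id": "P-101", "asset_type": "pump",
                          "flow_rate": 42.0, "vibration": 0.1})
# The twin keeps only template fields, so "vibration" is filtered out.
```

In a production platform the `on_event` handler would sit behind a streaming bus (e.g. Kafka-style topics) rather than a direct call, but the pattern is the same: template match, instantiate once, update continuously.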
Have ideas about other platform requirements? Interested in learning how the nD platform brings these elements together to manage the complexity? We’d love to hear from you.