Toward a new generation of Digital Twins

Andrea Lario
SISSA mathLab
Dec 3, 2020

Digital Twin is one of the trending topics of Industry 4.0, and it is usually defined as a digital replica of an existing product. This definition is simplistic, since Digital Twins can also reproduce processes, facilities, buildings, entire cities, and, in perspective, human beings.

The first definition of Digital Twin dates back to 2002 and was proposed within the Model Based Systems Engineering framework, but the idea of creating models of existing objects that behave as their real counterparts was already present in the 1960s; at the time, for the pioneers of simulation, it was probably more a dream than an idea.

In the last decades, technological and methodological progress has radically changed our perspective, and Digital Twins are now becoming both a valuable aid during the design and production processes and a reliable support for the product during its entire life-cycle.

During the design phase, creating Digital Twins by connecting models of different nature helps to cut costs and time-to-market, since it reduces the number of physical prototypes that one has to build. Moreover, modifications can be easily introduced by changing parameters, so optimisation processes and sensitivity analyses can be performed with little effort.

Digital Twins are beneficial for designing more efficient, “greener”, and safer products and, once the product has been developed, the twin can be integrated with its physical counterpart. During the life cycle, Digital Twins help to make good decisions by providing advanced information and by simulating different scenarios.
For example, digital models of existing products can be used for:
- predictions, e.g. anticipating how the behaviour changes when the settings are modified
- virtual sensing, for advanced monitoring of the product
- predictive maintenance and early failure detection
- real-time optimisation, when coupled with AI algorithms.

Do Digital Twins differ from standard simulations?
Both traditional simulations and Digital Twins run in virtual environments that represent the physical one, but Digital Twins are also capable of integrating real-time data.
In general Digital Twins must perform the following operations:
- data acquisition through sensors
- simulation
- visualisation of the results.
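
The three operations above can be sketched, under strong simplifications, as a loop. This is only a toy illustration: the sensor, the model, and all values are hypothetical stand-ins, not a real acquisition or simulation pipeline.

```python
import random

def read_sensor():
    """Stand-in for the data-acquisition step (hypothetical sensor)."""
    return 20.0 + random.uniform(-0.5, 0.5)  # e.g. a temperature reading

def simulate(state, measurement, blend=0.3):
    """Toy model update: nudge the simulated state toward the measurement."""
    return (1 - blend) * state + blend * measurement

def visualise(step, state):
    """Stand-in for a dashboard or plot."""
    print(f"step {step}: estimated state = {state:.2f}")

state = 15.0  # initial model state
for step in range(5):
    measurement = read_sensor()           # 1. data acquisition
    state = simulate(state, measurement)  # 2. simulation
    visualise(step, state)                # 3. visualisation
```

In a real Digital Twin, each of these three stubs is replaced by a full subsystem: an acquisition layer, a physics- or data-based model, and a monitoring front end.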
In addition, Digital Twins are used for representing complex systems, which generally involve different physical domains and a high number of parameters. Finally, traditional simulations are employed almost exclusively during the design phase of a product, for virtual testing purposes, while Digital Twins are intended to support the corresponding physical assets for their entire life-cycles.
In detail, three different levels of integration between the physical asset and the Digital Twin are possible:
- none: the Digital Twin is used only for simulation; this is how a virtual prototype is used, since the physical counterpart does not exist yet
- one-way: data flow from the sensors to the digital model
- two-way: the physical asset both feeds data to the Digital Twin and receives input from the model (e.g. real-time optimisation).
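
As a toy illustration of the two-way level, here is a minimal closed-loop sketch in which the twin receives a measurement and sends a correction back to the asset. The quantity, target, and gain are all invented for the example; a real setup would involve noisy sampled data and a proper controller.

```python
def sensor_reading(true_speed):
    """One-way link: data flow from the physical asset to the twin."""
    return true_speed  # in reality: noisy, sampled measurements

def twin_recommend(measured_speed, target_speed=100.0, gain=0.5):
    """Two-way link: the twin computes a correction to send back."""
    return gain * (target_speed - measured_speed)

speed = 80.0  # state of the (simulated) physical asset
for _ in range(10):
    measured = sensor_reading(speed)       # asset -> twin
    correction = twin_recommend(measured)  # twin -> asset
    speed += correction                    # the asset applies the input
print(f"speed after closed-loop corrections: {speed:.1f}")
```

With a one-way link, the loop would stop at the first arrow: the twin would only monitor, never act.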

Two-way information exchange between real asset and the related Digital Twin

A new paradigm for Digital Twins: build with equations, learn from data
As stated in the previous section, Digital Twins consist of near real-time, multi-physics, parametric models, given their ambition of mimicking real objects and the complexity of today's products and processes.
In addition, a technological limit comes from the fact that many edge devices, on which Digital Twins have to be installed, have modest onboard computational power.
Traditionally, this problem was overcome by building Digital Twins as connections of simple 0D/1D parametric models, but for many applications this is no longer sufficient, since complex simulations are required to obtain the desired level of information.

A new paradigm is therefore needed, and Machine Learning, HPC, Data Analytics, and Reduced Order Models are the pillars on which it is based.

In detail, Reduced Order Models are used to include in the Digital Twin descriptions of two-dimensional and three-dimensional physical phenomena governed by Partial Differential Equations, which are too expensive, from the computational point of view, to be solved in real time with traditional numerical techniques. On the other hand, data-based models can be used to create black-box models that relate inputs and outputs. The latter approach is non-physical, since no equations are involved, but it is efficient and sometimes it is the only way to model a subsystem, for example when the design details of its components are not known. The non-physicality of the solution can be critical when cases governed by complex multi-scale, multi-physics phenomena are considered: small changes in the conditions can cause big behavioural differences, so an equation-based approach should be preferred for high-consequence applications (e.g. in the aerospace field). This approach is also leading to emerging web computing technologies via web servers.
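To make the reduced-order idea concrete, here is a minimal sketch of Proper Orthogonal Decomposition, one common ROM building block: an expensive offline stage extracts a small basis from precomputed full-order solutions, and a cheap online stage works only with a handful of coefficients. The snapshot family and the energy tolerance are invented for the example.

```python
import numpy as np

# Toy snapshot matrix: each column is a full-order solution for one
# parameter value (here a parametrised sine profile on 200 grid points).
x = np.linspace(0, 1, 200)
params = np.linspace(1.0, 3.0, 20)
snapshots = np.column_stack([np.sin(mu * np.pi * x) for mu in params])

# Offline stage (expensive, done once): extract a reduced basis via the SVD.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
rank = int(np.searchsorted(energy, 0.9999)) + 1
basis = U[:, :rank]  # a few modes capture almost all of the variability

# Online stage (cheap, real time): work in the rank-sized coefficient space.
new_solution = np.sin(2.2 * np.pi * x)  # a solution for an unseen parameter
coeffs = basis.T @ new_solution         # rank numbers instead of 200 values
reconstruction = basis @ coeffs
error = np.linalg.norm(new_solution - reconstruction) / np.linalg.norm(new_solution)
print(f"{rank} modes, relative reconstruction error {error:.2e}")
```

In an actual ROM the online stage would also solve a small projected system rather than just compress a known solution, but the offline/online split shown here is the core of the speed-up.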
On the other hand, data management and Machine Learning-based techniques are fundamental for performing data assimilation, an operation that is essential to guarantee tight consistency between the behaviour of the models and that of the physical counterpart during its entire operative life. In fact, in this way one is able to keep track of the natural degradation of performance that a physical component undergoes during its activity because of wear, corrosion, and consumption.
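A minimal sketch of the assimilation idea, using a scalar Kalman-style update: the twin's belief about a (hypothetical) component efficiency is repeatedly corrected by noisy sensor readings, so the estimate follows a degradation the model alone cannot predict. All quantities and noise levels are invented for the example.

```python
import random

random.seed(0)  # deterministic toy run

estimate, variance = 1.0, 0.01   # the twin's belief about the efficiency
process_var, sensor_var = 1e-4, 4e-4

true_efficiency = 1.0
for day in range(100):
    true_efficiency -= 0.002     # hidden degradation of the physical asset
    # prediction step: the model alone does not know the degradation rate,
    # so its uncertainty grows
    variance += process_var
    # correction step: assimilate a noisy sensor reading
    reading = true_efficiency + random.gauss(0, sensor_var ** 0.5)
    gain = variance / (variance + sensor_var)
    estimate += gain * (reading - estimate)
    variance *= (1 - gain)

print(f"true {true_efficiency:.3f}, assimilated estimate {estimate:.3f}")
```

Without the correction step the estimate would stay at 1.0 while the real component degrades; with it, the twin tracks the asset throughout its operative life.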
Even if the final models generated by the ROM- and data-based approaches can run in real time on devices with limited computational power, such as smartphones, tablets, and laptops, the training phase required for their generation can be expensive; for this reason, High Performance Computing is a valuable aid in the generation of Digital Twin models.

In conclusion
Talking about Digital Twins is difficult and fascinating at the same time, because one does not talk about them per se but in relation to all the other innovations occurring in these years, such as ROMs, AI, data, sensors, IoT, and HPC.
Moreover, it is a topic that is challenging to generalise, since Digital Twins are tailored to the specific application at hand and can be applied to a huge variety of problems. Nevertheless, regardless of the field in which they have been applied, Digital Twins have proven their capability, especially when coupled with AI algorithms, in refining product design, real-time troubleshooting, advanced monitoring, and testing new ideas.
For sure, blending Machine Learning, sensor data, and equation-based models is the key to further closing the gap between the digital and real worlds and to creating the next generation of Digital Twins.
