A primer on digital twins


Digital twin technology, which is revolutionizing sectors from manufacturing to telecom and everything in between, is at the core of network assurance

As quality assurance becomes a critical business differentiator, industries from manufacturing and utilities to space research and telecom are all turning to one technology that is reimagining the way we test products and services: digital twins.

You may have heard the term before, but what does it really mean? And how has this single technology found such an expansive fit across so many diverse sectors? 

Digital twins have been gathering steam for several years. Some say that NASA’s Apollo 13 mission was one of the earliest documented examples. The Apollo 13 crew trained on a set of specialized simulators that prepared them for unforeseen scenarios in outer space, and later, those same simulators became the life-saving technology that charted the crew’s course back to safety.

If you are looking for more modern use cases, think of Google Maps or BMW’s virtual replica of its Regensburg plant in Bavaria, both great examples of digital twins in action.

The term “digital twin” first took off in popular culture when Gartner named it one of its top 10 strategic technology trends for 2017. Today, according to a McKinsey analysis, the global digital twin market is on track for roughly 60 percent annual growth and projected to reach an estimated $73.5 billion by 2027. And why not? The same analysis finds that 70 percent of senior executives at large companies are already exploring investments in the technology.

So what are digital twins? Digital twins are data-driven replicas of physical objects, systems, and processes. Think of a digital twin as a duplicate that captures all of the nuances and complexities of its real-world counterpart. A twin is created by modeling the underlying physics of the actual object, which is how it can so accurately mimic the characteristics and behaviors of the physical asset. Because this mathematical model can simulate real-world scenarios independently of the actual object, teams can test various tools and technologies and assess their impact without compromising operational continuity.
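
To make the idea concrete, here is a minimal Python sketch of the concept: a twin that mirrors a physical motor’s temperature from telemetry and runs what-if simulations on a simple cooling model. The motor, the physics, and every parameter below are hypothetical, chosen purely for illustration.

```python
from dataclasses import dataclass

# Illustrative sketch only: a twin mirrors a physical asset's state from
# telemetry and runs what-if simulations without touching the asset itself.
# The asset (a motor), its physics (Newtonian cooling), and all parameter
# values here are hypothetical.

@dataclass
class MotorTwin:
    ambient_c: float = 25.0      # assumed ambient temperature (°C)
    cooling_k: float = 0.05      # assumed cooling coefficient (1/min)
    heat_per_load: float = 1.2   # assumed heating rate per unit load (°C/min)
    temperature_c: float = 25.0  # mirrored state of the physical motor

    def sync(self, measured_temp_c: float) -> None:
        """Mirror the latest sensor reading from the physical motor."""
        self.temperature_c = measured_temp_c

    def simulate(self, load: float, minutes: int) -> float:
        """Predict temperature under a hypothetical load, one minute per
        step, without affecting the real motor."""
        temp = self.temperature_c
        for _ in range(minutes):
            heating = self.heat_per_load * load
            cooling = self.cooling_k * (temp - self.ambient_c)
            temp += heating - cooling
        return temp

twin = MotorTwin()
twin.sync(measured_temp_c=68.0)             # real telemetry keeps the twin current
print(twin.simulate(load=1.5, minutes=30))  # "what if we raise the load?"
```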

In network management

A very popular use case is networking, where predictive maintenance is key to meeting SLAs. Telecom operators, data centers, and even defense companies are now leveraging digital doubles of their networks to make sense of these vast, complex fabrics and deliver service assurance to their users.

Previously, it was not possible to anticipate a real-world scenario until it had already happened, and networking teams had to learn from their mistakes. Often, the price was too high. Digital twins changed that completely by making it possible, for the first time, to accurately anticipate the sequence of events that will unfold in production before it actually happens.

However, digital twin technology is not to be mistaken for traditional network modeling. Here is how digital twins go further (a minimal sketch of this loop follows the list). Network digital twins can:

  • Dynamically model the ever-changing behavior and functionality of a physical network throughout its entire lifespan in real time
  • Obtain real-time data from the production network by communicating with the fabric and its management layer
  • Reconfigure and optimize the physical devices when required using commands
  • Ingest data collected from IoT sensors and monitoring systems to adjust their optimization strategies
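
As a rough illustration of that sync-and-reconfigure loop, here is a Python sketch assuming a hypothetical REST management API; the endpoints (/topology, /telemetry, /devices/{id}/config) are stand-ins for whatever interface a real fabric’s management layer actually exposes.

```python
import requests  # assumes the hypothetical management layer speaks REST/JSON

TWIN_STATE: dict = {}  # in-memory mirror of the fabric; a real twin would persist this

def sync_from_fabric(mgmt_url: str) -> None:
    """Pull live topology and telemetry from the (hypothetical) management layer
    so the twin tracks the physical network in near real time."""
    TWIN_STATE["topology"] = requests.get(f"{mgmt_url}/topology", timeout=5).json()
    TWIN_STATE["telemetry"] = requests.get(f"{mgmt_url}/telemetry", timeout=5).json()

def push_optimization(mgmt_url: str, device_id: str, config: dict) -> None:
    """Reconfigure a physical device through the management layer when the twin
    decides an optimization is warranted."""
    requests.post(f"{mgmt_url}/devices/{device_id}/config", json=config, timeout=5)
```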

With a network digital twin serving as a sandbox, teams can play out entire scenarios with data exactly as it occurs in production, without ever touching the production environment. The result is improved uptime, fewer outages, lower security risk, and an optimal user experience for all.
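
Here is one way such a sandbox experiment might look in code: a what-if that fails a link on a copy of the twin’s topology and checks reachability. The four-node fabric and the use of networkx as the network model are illustrative assumptions, not any vendor’s API.

```python
import networkx as nx  # generic graph library standing in for a real network model

def simulate_link_failure(topology: nx.Graph, failed_edge: tuple, src: str, dst: str):
    """Run a what-if on the twin: remove a link and check whether traffic
    between two endpoints still has a path. Production is never touched."""
    sandbox = topology.copy()        # the twin experiments on a copy, never the fabric
    sandbox.remove_edge(*failed_edge)
    if nx.has_path(sandbox, src, dst):
        return nx.shortest_path(sandbox, src, dst)
    return None  # the failure would partition src from dst: an outage

# Hypothetical two-leaf, two-spine fabric for illustration
g = nx.Graph([("leaf1", "spine1"), ("leaf1", "spine2"),
              ("leaf2", "spine1"), ("leaf2", "spine2")])
print(simulate_link_failure(g, ("leaf1", "spine1"), "leaf1", "leaf2"))
# -> ['leaf1', 'spine2', 'leaf2']: traffic survives the failure via spine2
```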

Naturally, the diagnostics provided by a digital twin model have to be continually verified and validated to ensure confidence in the results. One framework for determining the accuracy and reliability of the predictions is verification, validation, and uncertainty quantification (VVUQ), a process that analyzes how faithfully the model reflects the real-world system it represents. As adoption accelerates, however, new and more evolved evaluation methods will be needed to ensure full fidelity of the models.
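
To give a flavor of the validation step, the sketch below compares a twin’s predictions against real measurements using a few simple residual statistics. Real VVUQ frameworks go far beyond a single error metric; this is only illustrative, and the sample numbers are made up.

```python
import statistics

def validation_report(predicted: list[float], observed: list[float]) -> dict:
    """One simple check in the spirit of VVUQ's validation step: quantify how
    far the twin's predictions drift from real-world measurements."""
    residuals = [p - o for p, o in zip(predicted, observed)]
    rmse = (sum(r * r for r in residuals) / len(residuals)) ** 0.5
    return {
        "rmse": rmse,                           # aggregate prediction error
        "bias": statistics.mean(residuals),     # systematic over/under-prediction
        "spread": statistics.stdev(residuals),  # a crude uncertainty estimate
    }

# Made-up twin predictions vs. observed telemetry
print(validation_report([70.1, 71.5, 72.0], [69.8, 71.9, 73.1]))
```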
