November 25, 2019

Enhancing Digital Twins Part 1: Harnessing Data and Predictive Maintenance to Enhance Digital Twins

By Theta

Part 1 of a 4-part series about displaying Predictive Maintenance insights in Digital Twins, using Azure Databricks. Before we get into the setup details, we’ll introduce the concept of Predictive Maintenance and the tools we used to make predictions.

We've previously explored simulating Digital Twins of IoT devices within Theta’s augmented reality (AR) platform, Mixiply, to create an end-to-end Digital Twin environment in AR.

This allowed us to visualize physical IoT devices alongside their sensor data - temperature, pressure and humidity - in AR, in real time. We'd previously discussed the potential of Digital Twinning, concluding that "with the help of AI and Machine learning, the data collected [from Digital Twins] could also be used to predict and diagnose problems before they even happen", among other things.

Digital twin scenarios

Since then, we’ve been implementing Digital Twins at an even greater scale using Mixiply. We are currently capable of creating expansive Digital Twins of real-world locations such as train stations.

Part of a large-scale train station Digital Twin in Mixiply/AR

We are redirecting IoT sensor data from physical entities to trigger incident alerts in their associated Digital Twins, helping personnel remotely monitor areas of concern in the real-world environment. As we've alluded to already, our next step is to head off those concerns before they arise by harnessing the collated data for fault prediction and diagnostics. Operational equipment benefits greatly from this sort of routine monitoring, a practice known as Predictive Maintenance.

We’ve recently constructed a large-scale Digital Twin of a warehouse in Mixiply, and as a part of the Digital Twin, we are conducting Predictive Maintenance upon its assets.

Overhead of a large-scale warehouse Digital Twin in Mixiply/AR

The following video is a walk-through of the entire twin, which shows a high-level overview of how we are harnessing Predictive Maintenance.

What is Predictive Maintenance and how does it work?

Predictive Maintenance involves routinely analyzing equipment assets to estimate their next point of failure ahead of time, minimizing asset or operation downtime, maximizing productivity and avoiding unnecessary maintenance. It is supported by AI and machine learning, which help you make more informed decisions by analyzing both legacy and real-time data. Predictive Maintenance extends the practicality of our Digital Twins technology, and in the following three blog posts we will detail the entire process from start to finish.

Warehouse’s Digital Twin cooling asset suggestions in Mixiply/AR

Predictive Maintenance tools

But first, let's discuss the tools we've chosen to conduct Predictive Maintenance.

We are using R to compute our asset failure predictions, as it has a solid library of data science packages, including survival and corrplot.
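To give a feel for what those packages do, here is a minimal sketch of a survival analysis on a hypothetical maintenance log. The data frame, column names and values are purely illustrative - they are not taken from our warehouse assets.

```r
# Minimal sketch (not our production code): estimate how long assets run
# before failing, and inspect correlations between usage variables.
library(survival)
library(corrplot)

# Hypothetical maintenance log: runtime hours, failure flag, temperature
maintenance_log <- data.frame(
  hours_run   = c(120, 340, 560, 410, 780, 650),
  failed      = c(0, 1, 1, 0, 1, 1),   # 1 = failed, 0 = still running (censored)
  temperature = c(21, 35, 38, 24, 41, 39)
)

# Kaplan-Meier estimate of asset survival over runtime hours
fit <- survfit(Surv(hours_run, failed) ~ 1, data = maintenance_log)
summary(fit)

# Visualize correlations between the numeric variables
corrplot(cor(maintenance_log))
```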

You can interact with R on your computer through RStudio, an IDE built for R.

But what happens when the amount of data to be analyzed exceeds the limits of your PC's RAM?

This is just the job for Apache Spark.  

Apache Spark is a cluster-computing framework that distributes large workloads across many machines, in parallel and in a fault-tolerant manner. It also has an R interface through the 'sparklyr' package, which can process hefty datasets in Spark even when Spark is running on your local PC.
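The sketch below shows what that local workflow looks like with sparklyr, assuming Spark and the sparklyr package are installed on your machine; the data frame and column names are again illustrative, not from our actual pipeline.

```r
# Minimal sketch of pushing work into a local Spark instance via sparklyr.
library(sparklyr)
library(dplyr)

sc <- spark_connect(master = "local")   # start a local Spark instance

# Hypothetical maintenance data, as in the earlier example
maintenance_log <- data.frame(
  hours_run = c(120, 340, 560, 410, 780, 650),
  failed    = c(0, 1, 1, 0, 1, 1)
)

# Copy the data into Spark and do the heavy lifting there
maintenance_tbl <- copy_to(sc, maintenance_log, "maintenance_log", overwrite = TRUE)

maintenance_tbl %>%
  group_by(failed) %>%
  summarise(mean_hours = mean(hours_run, na.rm = TRUE)) %>%
  collect()                             # only the small summary comes back into R

spark_disconnect(sc)
```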

Although working locally with Apache Spark is a viable solution, cloud computing is now the de facto standard for this kind of workload, and our Digital Twin processes are already largely cloud-based, so it seemed appropriate to compute our analyses in the cloud too. Using Azure Databricks to conduct Predictive Maintenance on the real-life assets behind our Digital Twins was therefore an obvious decision: it is built upon Apache Spark and can process massive datasets in the cloud.
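As a taste of what is to come, sparklyr can attach to the cluster behind an Azure Databricks R notebook via its "databricks" connection method; treat the snippet below as indicative only, since the next post covers the setup properly.

```r
# Sketch of using sparklyr from inside an Azure Databricks R notebook.
library(sparklyr)

sc <- spark_connect(method = "databricks")   # connect to the notebook's cluster

src_tbls(sc)   # list the tables available on the cluster

spark_disconnect(sc)
```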

And that's it for this post. We hope the concept of Predictive Maintenance in Digital Twins has piqued your interest. In our next post, we'll explain more about Azure Databricks and how to prepare it for Predictive Maintenance analyses.