
Seeing double

Digital twins have been used by the likes of NASA and Formula 1 and, thanks to recent advances in cloud computing and sensors, utilities are exploring their potential to improve asset monitoring and maintenance and to simulate effects on the network more accurately.

Heavy rain frequently overwhelms Auckland’s sewers, causing overflows that flush raw effluent into local rivers and harbours and posing a serious health risk for the thousands of swimmers who frequent local beaches.

In an effort to better track and mitigate the problem, Auckland Council worked with Mott MacDonald’s Smart Infrastructure business to create a powerful ‘digital twin’ of the city’s wastewater infrastructure.

The award-winning tool, hosted on the cloud-based platform Moata, combines up-to-the-minute data from the wastewater and stormwater network with weather and tidal data and various predictive analytic models.

This digital simulation generates a real-time forecast of water quality at 84 beaches and eight freshwater locations around the city, and serves up live advice on swimming conditions via the website Safeswim.org.nz.

Digital twins stand at the forefront of innovation and, thanks to the increasing penetration of Internet of Things (IoT)-enabled devices, machine learning techniques and building information modelling (BIM), are expected to play a vital role in the digital transformation of infrastructure.

Digital twins make it possible to accurately map the location and condition of assets, minimising costs associated with maintenance and repairs. They can provide a virtual ‘safe space’ to test out adverse network scenarios and innovations without having to interfere with the operation of real infrastructure. The benefits are increased when multiple models from different sources and sectors are linked together to leverage each other’s data to improve municipal or national planning and co-ordination.

Sarah Hayes is the senior policy adviser at the National Infrastructure Commission (NIC), which is spearheading the development of a National Digital Twin of UK infrastructure. She says: “Utilities are developing digital twins, which is great for them and for their customers, but we will see an even greater public benefit if we bring those and other models together to improve the way infrastructure works in a local area, in a city, or nationally. It’s a long-term vision, but the building blocks are being put in place.”

A digital twin is defined as a digital representation of a physical asset or system, which provides information about its current design, state, condition and its history. A twin can be used to improve decision making around what future infrastructure to build, or how to manage current and future infrastructure. A prerequisite is the inclusion of some element of ‘live’ data and a connection between the physical entity and the twin.
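
In software terms, even a minimal digital twin boils down to an object that carries an asset’s static design data, mirrors its live state from telemetry and keeps a history of readings. The sketch below is purely illustrative: the class, fields and threshold are hypothetical, not drawn from any vendor’s platform.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PumpTwin:
    """Hypothetical digital twin of a single wastewater pump."""
    asset_id: str
    design_flow_lps: float                 # static design data
    current_flow_lps: float = 0.0          # live state, updated from telemetry
    history: list = field(default_factory=list)  # (timestamp, reading) pairs

    def ingest(self, reading: dict) -> None:
        """Update the twin's live state from a sensor reading."""
        self.current_flow_lps = reading["flow_lps"]
        self.history.append((datetime.now(timezone.utc), reading))

    def is_degraded(self) -> bool:
        """Flag the asset if it is running well below design flow."""
        return self.current_flow_lps < 0.5 * self.design_flow_lps

twin = PumpTwin(asset_id="AKL-PS-042", design_flow_lps=120.0)
twin.ingest({"flow_lps": 48.0})
print(twin.is_degraded())  # True - flow has dropped below half of design
```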

The 2018 Gartner Hype Cycle, which assesses the maturity and adoption of emerging technologies, lists digital twins among the technologies likely to achieve mainstream business adoption in the next five to ten years. Major software vendors, such as Microsoft and Bentley, have recently launched tools that enable infrastructure owners to capture, simulate and interrogate their assets as digital twins. Bentley’s iTwin Services cloud platform, for example, enables the creation of digital twins of both civil infrastructure projects and operational infrastructure assets.

The technology has the potential to give utility companies greater visibility of network performance and to quickly crunch the numbers when planning complex scenarios, such as an outage or the impact of adding new infrastructure.

Decentralised energy generation

Consultancy Accenture has developed digital twins for electricity transmission companies to help them plan for complications related to decentralised energy generation and the increasing penetration of renewables.

Rohit Banerji, global lead on the development of big data analytics platforms at Accenture, observes: “Our clients are beginning to ask for twins to dynamically plan generation, operational routing, and to run scenarios to understand how the congestion will play out in the bi-directional grid. For example, if solar was to grow in a particular part of the country, or a new wind farm was introduced, how will it impact on congestion?”

Power distribution companies have used twins to simulate the future impact of electric vehicles and related high-volume power storage and fast charging on a network.

Digital twins are very efficient engineering problem solvers. Where a traditional desk study might require extensive manual data gathering, engineering calculations and analysis using software or spreadsheets, twins can combine real-time data streams from various sources and run them through sophisticated algorithms and machine learning to produce rapid results.

Mott MacDonald’s Smart Infrastructure business used this type of approach when developing the digital twins for Auckland Council and a “number of water and sewage companies in the UK” that cannot currently be named for non-disclosure reasons.

Oli Hawes, head of smart infrastructure, says: “We’re trying to tap into real-time data from across their businesses, whether that’s from rain gauges, their GIS [geographic information system] or their customer relationship management system. Many data sources are themselves already digital twins, for example a digital representation of a hydraulic model built in software. We connect all that together, then layer in other whole system data sources, such as the national weather, river level data from the Environment Agency or tidal information, to create a digitised desk study that runs in real time.”
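
The ‘digitised desk study’ Hawes describes is, in essence, a join of several live feeds followed by a model evaluation. Here is a minimal sketch of that pattern, with invented feed functions and a placeholder risk model standing in for Moata’s actual analytics:

```python
import random
from datetime import datetime, timezone

# Hypothetical stand-ins for live feeds; a real twin would poll rain
# gauges, the Environment Agency river API, tidal services and so on.
def rain_gauge_mm_per_hr():  return random.uniform(0, 25)
def river_level_m():         return random.uniform(0.2, 2.5)
def tide_height_m():         return random.uniform(-1.0, 1.5)

def overflow_risk(rain, river, tide):
    """Placeholder model: crude 0-1 risk score from the three inputs."""
    return min(1.0, 0.04 * rain + 0.2 * river + 0.15 * max(tide, 0))

snapshot = {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "rain_mm_hr": rain_gauge_mm_per_hr(),
    "river_m": river_level_m(),
    "tide_m": tide_height_m(),
}
snapshot["risk"] = overflow_risk(
    snapshot["rain_mm_hr"], snapshot["river_m"], snapshot["tide_m"]
)
print(snapshot)  # one 'desk study' result, recomputed on every tick
```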

The Moata platform is able to run around 30 different scenarios in three seconds, where previously the same analysis would have taken 14 days using conventional techniques, says Hawes: “The power goes exponential when you start to move past data collection and visualisation on its own to link your models to the real system and unlock value from those additional insights.”
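
Scenario batches of this kind are naturally parallel: each run takes its own inputs, and the results are gathered at the end. A generic sketch of the pattern (not Moata’s implementation), with a toy response curve in place of a real hydraulic simulation:

```python
from concurrent.futures import ProcessPoolExecutor

def run_scenario(rainfall_mm: float) -> dict:
    """Hypothetical stand-in for one hydraulic simulation run."""
    overflow = max(0.0, (rainfall_mm - 10.0) * 0.8)  # toy response curve
    return {"rainfall_mm": rainfall_mm, "overflow_megalitres": overflow}

if __name__ == "__main__":
    scenarios = [5.0 + 2.5 * i for i in range(30)]   # 30 rainfall cases
    with ProcessPoolExecutor() as pool:              # runs cases in parallel
        results = list(pool.map(run_scenario, scenarios))
    worst = max(results, key=lambda r: r["overflow_megalitres"])
    print(f"Worst case: {worst}")
```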

Digital twins are still a nascent technology, and various technical, economic and behavioural challenges mean it may be many years before utilities have fully functional models of their entire asset bases.

Bird’s eye view

The most advanced twins today typically focus on small-scale, high-criticality problems, such as a specific issue with an energy substation, or on long-term planning scenarios that are more tolerant of a lack of real-time data sources and rigorously accurate engineering models.

Building an all-encompassing ‘bird’s eye’ view of a network, including live operational data, would require major investment to retrofit IoT sensors into existing infrastructure and upgrade IT systems. Some networks are constrained by older infrastructure that’s been in place for 40-plus years.

However, that level of forensic detail is not necessary to benefit from the technology, says Samuel Chorlton, project lead at the Data and Analytics Facility for National Infrastructure (DAFNI): “There isn’t one digital twin that solves all things, there are multiple types that a business might look to develop to answer different business questions and the associated data required to drive that is going to vary too.”

When working with clients, Smart Infrastructure tries to break down the need for sensors into a problem-solving exercise around ‘data management, sense making and decision making’. “When clients follow that process they often find they have some of what they need already, and where there are gaps data science takes care of it and we can build the insight piece from what we have,” says Hawes.

The success of digital twins is largely reliant on data interoperability and the ability to share data in different formats, but historically siloed thinking in the infrastructure sector threatens to limit the insights available.


NIC sets the agenda

The NIC recognised the critical importance of data sharing to improve infrastructure performance in its groundbreaking report “Data for the Public Good”, published in 2017.

The government’s response was to ask the Centre for Digital Built Britain (CDBB) to lead the development of the Information Management Framework, which will lay the foundation for the development of digital twins and, ultimately, the creation of a National Digital Twin: an interconnected ecosystem of twins for infrastructure including utilities, roads, rail, schools and hospitals.

The CDBB has published the Gemini Principles, which set out a high-level picture of what digital twins should look like, and a basic roadmap for the Information Management Framework.

BIM is expected to lay much of the groundwork for the Framework, explains the NIC’s Hayes: “BIM is all about having data for building infrastructure in a common format and the Framework will build upon that and apply it to existing infrastructure so we can label what we have already got so we can use it integrally and interoperatively with new infrastructure.”

A Digital Twin Hub (DT Hub), a collaborative learning community for those who own or develop twins – including government, asset owners, standards organisations and academia – was launched in April in an effort to translate many of these ideas into reality.

DAFNI’s Chorlton was appointed as chair of the Steering Group for the DT Hub. He told Flex: “We are going to look at questions like: how do we standardise our approach? What does the development of digital twins look like? What does a simple digital twin look like, versus a complex digital twin? How can we provide a common ontology and a common taxonomy so that when, for example, an energy digital twin is trying to talk to a water digital twin, there is a common language they can use to interact? We want to steer everyone in a similar direction so we are contributing to the same outcomes.”
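
One concrete reading of that ‘common language’ is an agreed, sector-neutral schema for the observations twins exchange, so an energy twin and a water twin can parse each other’s messages without bespoke translation code. The sketch below is illustrative only and is not the DT Hub’s actual ontology:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AssetObservation:
    """Hypothetical shared vocabulary for cross-sector twin exchange."""
    sector: str        # e.g. "energy", "water"
    asset_type: str    # drawn from an agreed taxonomy
    quantity: str      # agreed quantity name, e.g. "flow", "power"
    unit: str          # explicit unit string so no twin has to guess
    value: float

# An energy twin and a water twin emit observations the other can parse
# without any bespoke translation layer.
obs_energy = AssetObservation("energy", "substation", "power", "MW", 12.4)
obs_water = AssetObservation("water", "pumping_station", "flow", "L/s", 310.0)
for obs in (obs_energy, obs_water):
    print(f"{obs.sector}/{obs.asset_type}: {obs.quantity}={obs.value} {obs.unit}")
```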

According to Chorlton, technology is the easiest piece of the digital twin puzzle; it is industry culture and a reluctance to open up data that will be hardest to overcome. “We’ve got to provide reassurance that this can be done in a responsible manner and that it is not going to make your company vulnerable to commercial losses or make us vulnerable from a national security perspective. We’re confident in our approach and how to leverage the technology, it’s now a matter of bringing industry along for the journey,” he concludes.


Sensors, resilience and strategic decisions

Northumbrian Water is working with researchers at Newcastle University to develop three digital twins designed to improve its operational and strategic decision making.

The water utility invested around £120,000 in the project, which will see four PhD researchers from the University develop models using data from an open source network of sensors installed across the city, plus other sources.

The first twin will aim to capture the biogas production and upgrading process, whereby sewage sludge is digested to create biogas. Biogas can have propane added to increase its calorific value for supply to the gas network, which adds cost, or it can be fed into in-house combined heat and power engines to generate electricity.

Chris Jones, research and development manager at Northumbrian Water, explains: “The idea is we capture those processes as a series of mathematical models, then bring in sensor readings from the processes themselves, plus live information on things like the market value of gas, and the digital twin will use analytics to advise on the best use to make of the gas. The results will be fed as some kind of decision support to the operational team.”
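
The decision logic Jones outlines can be caricatured as a running comparison of two revenue paths, re-evaluated as live prices change. All constants and figures below are invented for illustration; the real twin wraps detailed process models around this choice.

```python
def best_use_of_biogas(biogas_m3_per_hr: float,
                       gas_price_per_kwh: float,
                       power_price_per_kwh: float) -> str:
    """Compare grid injection (with propane enrichment cost) against
    on-site CHP generation. All constants are illustrative."""
    ENERGY_KWH_PER_M3 = 6.0      # assumed biogas calorific value
    PROPANE_COST_PER_M3 = 0.05   # assumed enrichment cost to reach grid spec
    CHP_EFFICIENCY = 0.38        # assumed electrical efficiency

    grid_margin = (biogas_m3_per_hr * ENERGY_KWH_PER_M3 * gas_price_per_kwh
                   - biogas_m3_per_hr * PROPANE_COST_PER_M3)
    chp_margin = (biogas_m3_per_hr * ENERGY_KWH_PER_M3
                  * CHP_EFFICIENCY * power_price_per_kwh)
    return "inject to grid" if grid_margin > chp_margin else "run CHP"

# Re-evaluated whenever live market prices tick over
print(best_use_of_biogas(500.0, gas_price_per_kwh=0.03,
                         power_price_per_kwh=0.12))  # -> "run CHP"
```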

The second twin will aim to predict the various impacts of an operational incident on the network, such as a burst water main, which has traditionally prevented engineers from intervening because access is blocked, either by water or by traffic disruption.

The first prototype combined a surface model of the city with hydraulic modelling software to show where water from a burst pipe, at any location and with a given flow rate, would move over 90 minutes. That is now being expanded into a browser-based tool that will allow any authorised user to quickly see where the water will flow and manage the response, including sharing the results with the emergency services and city authorities to help them understand the scale of the likely disruption.
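
At its simplest, the burst-main prototype couples a terrain model with a routing routine: water spreads outward from the burst, reaching lower-lying cells first. A toy grid version of the idea, nowhere near the fidelity of real hydraulic software, might look like this:

```python
import heapq

# Toy digital elevation model (metres); the burst occurs at cell (1, 1).
DEM = [
    [5.0, 4.8, 4.9],
    [4.9, 4.5, 4.4],
    [4.8, 4.3, 4.1],
]

def flood_extent(dem, start, steps):
    """Spread outward from the burst, visiting the lowest reached cell
    first - a crude stand-in for 90 minutes of hydraulic routing."""
    rows, cols = len(dem), len(dem[0])
    flooded = {start}
    frontier = [(dem[start[0]][start[1]], start)]
    for _ in range(steps):
        if not frontier:
            break
        _, (r, c) = heapq.heappop(frontier)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in flooded:
                flooded.add((nr, nc))
                heapq.heappush(frontier, (dem[nr][nc], (nr, nc)))
    return flooded

print(sorted(flood_extent(DEM, start=(1, 1), steps=4)))
```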

The final twin is more strategic and longer term, aiming to understand future demand for water and wastewater services. It will capture in detail how customers currently use their services and then identify trends, such as the efficiency of white goods or the effect of people paving over their gardens, which change the water use and drainage characteristics of a catchment.

“Capturing those individual choices at property level, then aggregating the data to understand what might be happening at a catchment level, can help us understand whether the current service position will be sufficient,” says Jones. “It’s about understanding what technology and changes to behaviour might mean for us in the future so we can adapt our services to ensure they remain resilient and relevant.”
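
The aggregation step Jones describes is simple in outline: property-level assumptions roll up into catchment-level figures that can be compared against current capacity. A hedged sketch with invented numbers:

```python
# Hypothetical property-level scenarios: occupancy, appliance efficiency
# and the share of each plot that is paved over (affects drainage).
properties = [
    {"occupants": 2, "efficient_appliances": True,  "paved_fraction": 0.7},
    {"occupants": 4, "efficient_appliances": False, "paved_fraction": 0.3},
    {"occupants": 3, "efficient_appliances": True,  "paved_fraction": 0.5},
]

def daily_demand_litres(prop):
    """Assumed per-person use, reduced for efficient white goods."""
    per_person = 110 if prop["efficient_appliances"] else 145
    return prop["occupants"] * per_person

catchment_demand = sum(daily_demand_litres(p) for p in properties)
mean_paved = sum(p["paved_fraction"] for p in properties) / len(properties)

print(f"Catchment demand: {catchment_demand} L/day")
print(f"Mean paved fraction: {mean_paved:.2f} (drives runoff assumptions)")
```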

This article first appeared in Flex, issue 3.