
Is the idea of presumed open data a bridge too far for utilities?

It’s hard to pass a single day in the UK utilities sector without being hit by some new, and usually urgent, message about the transformational potential of data for the sector.

Customer and asset data, we are repeatedly told, have the power to disrupt existing utility market structures, the competitive landscape and expectations of organisational performance.

Companies that successfully re-engineer themselves to allow dynamic data flows to drive their activities will reap rewards in customer acquisition and retention or regulatory outperformance, vendors promise. Furthermore, they have the opportunity to carve out new, higher value roles for themselves in future markets.

Most utilities have taken these arguments for investing in new data architectures and capabilities on board and have embarked on programmes to gain better visibility of organisational data, spread intelligence into “dumb” areas of their asset base and put in place analysis tools and talent to extract what is popularly termed “actionable insight” from data.

The maturity and success record of these programmes is mixed across the sector – though, as Utility Week has previously observed, it is generally true to say that many utilities are struggling to push beyond “initiative-based” digital innovation.

With this in mind, how realistic is it to hope that the sector could unite to share data across organisational boundaries and amplify the value of data sets for the good of customers, the UK economy and the environment?

That question drove most of the discussion at a recent Utility Week event in London, hosted in association with Talend, with participation from data and technology leaders at a cross-section of utilities.

The debate was spurred by opening remarks from Richard Dobson, co-author of the recent Energy Data Taskforce report: A Strategy for a Modern, Digitalised Energy System. Dobson shared the report’s recommendations – which were broadly welcomed by government and industry when they were issued in June this year – including an ambition to adopt a “presumed open” principle for all energy system asset data.

Putting the presumed open principle in place, Dobson proposed, would mark a move away from trying to get companies to release data for key trials and experiments and instead challenge companies to justify why they could not share data. The change in approach could facilitate innovation and accelerate the utility sector’s response to grand challenges like the 2050 net zero emissions target.

The response of event participants to this ambition was mixed, with a strong element at the table expressing concern and caution about the many difficulties and risks they perceived to be involved. Many of these related to security and commercial interests, but there were also warnings about the practical challenges to be overcome in terms of aligning taxonomy and data labelling across multiple systems in order to avoid perverse outcomes from data sharing.

Two participants also expressed worry about the response of regulators to any cross-sector open data initiative. One had burnt its fingers before by admitting data weaknesses in the interests of collective learning, only to be penalised for its honesty.

Benefits of data democracy

Notwithstanding these issues, there was a more enthusiastic grouping around the table who latched on to the opportunities that could arise from achieving data democracy.

Interestingly, while a number of collective benefits were eyed in terms of effective and resilient system operation, the discussion that generated the most excitement was focused on the opportunity to use data collaboration to help shift the dial for public perception of utilities and transform the sector’s relationship with customers.

Advocates of this view argued that using a clearly communicated open data programme to position utilities as leaders in the fight against climate change would align them with a “greater good” and help lift the insidious cloud which so many consumers see looming over the sector.

Doing this effectively, they said, would deliver far more value than any pounds and pence figure that might be forecast from an individual company’s commercial exploitation of data – and could well create a wider watering hole of commercial opportunity and economic uplift.

To help unleash this potential and motivate less enthusiastic companies, several attendees recommended that utility regulators consider incentives to advance the adoption of open data principles and prompt engagement in undertaking the necessary harmonisation of data referencing, as well as the build of a common data catalogue.

As one attendee observed after the event: “There was such a contrast between those who accepted we must share data and those who clearly had some very entrenched views. If we want the benefits of data sharing, there will need to be some intervention to mandate the more progressive view.”

Uniper’s business-driven data science strategy

Attendees at Utility Week and Talend’s event also benefited from detailed insights into the journey Uniper has taken to transform the value of its data.

Uniper – the generation and trading company that spun out of Eon as part of a strategic reinvention process in 2016 – knows the value of real-time data to enable agile decision-making.

“As a trading organisation, data has always been something we were very focused on,” says Rene Griener, vice president data integration at Uniper. “But over the past ten years, the number of new players and disruptors that have come into the market means it now matters more than ever that we can price risk in the market correctly, and for this we need more information and to be able to act on it more quickly.”

To respond to this pressure, in 2017, Uniper embarked on a campaign to achieve total visibility and democratisation of organisational data, and to support its exploitation with a squad of data scientists focused on business needs.

The journey started with the creation of an organisation-wide data vision and a leap into the public cloud with the build of a central data lake – with support from Talend – using the popular data warehousing technology Snowflake. A commitment was also made for the data science function to only work on data use cases that had sponsorship from a business decision-maker.
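To make the pattern concrete, here is a minimal Python sketch of landing a raw extract in a Snowflake table via the snowflake-connector-python package. The account, credentials, file and table names are invented placeholders; the article does not detail Uniper’s actual pipeline.

```python
# Hedged sketch: loading a raw extract into a Snowflake landing table.
# All identifiers below (account, user, warehouse, table) are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="DATA_LOADER",       # hypothetical service account
    password="***",
    account="acme-eu1",       # hypothetical account identifier
    warehouse="LOAD_WH",
    database="DATA_LAKE",
    schema="RAW",
)

cur = conn.cursor()
# Upload the local file to the table's internal stage, then copy it in.
cur.execute("PUT file:///tmp/trades.csv @%RAW_TRADES")
cur.execute(
    "COPY INTO RAW_TRADES FROM @%RAW_TRADES "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)
conn.close()
```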

“We wanted our focus to come from decision-makers,” says Griener. “We knew that to make a real difference, our momentum had to come from those who felt the pain every day of not having the information they needed.

“We also knew that those decision-makers are the ones who hold the most valuable and commercially important data. So it was essential to our mission to break down silos that we engaged them.”

At the start of 2018, Uniper’s cloud architecture was ready and Griener sought out a business sponsor to volunteer a use case that could prove its potential.

The use case looked at the burden placed on the business function to produce daily intelligence reports which compiled information from exchanges, brokers, weather stations and other data sources, and to use this information to run models to support effective trading.

“The activity was taking up around 70 per cent of their time, so it was a real pain point and we knew they had not been able to get support from IT before to ease this, because the sources of data used changed too frequently, meaning the structure of the data also changed.”

The new architecture, combined with the new business service mentality of the data science function, meant the use case took three weeks to address and the trading team came away with a solution that freed up their time and empowered them to nimbly leverage a broader set of intelligence sources than ever.
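The article does not describe Uniper’s implementation, but one common way to tolerate sources whose structure changes frequently is to wrap each record in a stable envelope and keep the body schema-free. A hedged Python sketch, with invented source names:

```python
import json
from datetime import datetime, timezone

def to_envelope(source: str, record: dict) -> dict:
    """Wrap an arbitrary source record in a stable envelope so that
    new or renamed fields never break the landing table."""
    return {
        "source": source,                                  # hypothetical feed name
        "ingested_at": datetime.now(timezone.utc).isoformat(),
        "payload": json.dumps(record),                     # schema-free body
    }

rows = [
    to_envelope("weather_station_7", {"temp_c": 11.2, "wind_ms": 4.1}),
    to_envelope("broker_x", {"product": "DE-baseload", "bid": 41.5, "ask": 41.9}),
]

# Downstream, each payload can sit in a single VARIANT/JSON column and
# models pick out the fields they need at query time.
for row in rows:
    print(row["source"], row["payload"])
```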

Once word got out among decision-makers that data science was now a business resource – rather than an IT one – submissions for further use cases surged.

“Within a year we had over 120 data sources connected [to the data lake],” says Griener. “Currently, we have 40 use cases in production, which equates to 1,500 users from within the business.” There are also around 400 users from third party organisations, and Uniper has successfully monetised outputs from data use case experiments.

The journey has highlighted the value of data science capabilities to the organisation and glamourised the skill set involved. “Everyone wants to be a data scientist now,” says Griener. But the shift from data reconciliation and processing to true data science is a big one, and he says it’s important to be able to manage expectations around the potential for retraining.

The value of data science has also raised questions about the strategic argument for freeing up more IT budget to expand the data science unit.

Opinion: Mark Truswell

Enterprise sales director UK&I, Talend

Democratising access to data realises the value of that data immediately.

There are no technical blockers today to achieving the kinds of outcomes from data exploitation that utilities are now talking about as future scenarios. However, there are psychological blockers and entrenched, top-down or “waterfall” approaches that make achieving the necessary transparency and accessibility of data very difficult.

For example, we are working with a large energy company that was struggling, with its current vendor, to get the value from data that it wants.

It has a centre of excellence for data science in the IT function, which raises exciting use cases and puts project teams behind them. But six months after a project launches, and with £100,000 spent, those teams may only just be getting access to the data they need to act on. The company wanted us to create a better technology solution to get that data to them faster and free up IT to work on the use cases.

Our proposal to them is simple. We can make data available through any existing interface they have – it doesn’t matter who the vendor is. What is key is that we can make data available to everyone, everywhere, in near real time and in a way that has strong governance and approvals.

By democratising access to the data and letting the business own the use cases, the value of data will be realised immediately. But that is a big leap of faith for many utilities today.
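As a toy illustration of what “strong governance and approvals” can mean in practice – this is an invented sketch, not a description of Talend’s product API – every data request can be gated by an explicit policy check before anything is served:

```python
# Invented role/dataset pairs; a real deployment would use the governance
# layer of whatever data platform is in place.
APPROVED_ACCESS = {
    ("trading", "market_prices"): True,
    ("trading", "customer_pii"): False,
}

def fetch_dataset(role: str, dataset: str) -> str:
    """Serve a dataset only if the role has a recorded approval."""
    if not APPROVED_ACCESS.get((role, dataset), False):
        raise PermissionError(f"{role} has no approval for {dataset}")
    return f"serving {dataset} to {role}"  # stand-in for a real query

print(fetch_dataset("trading", "market_prices"))
```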

Also, crucially, when companies take this step, they must think carefully about data taxonomy. Too many times I’ve seen companies invest time and money into bringing data together and working on use cases, only for the project to fall down because of an argument between two or more stakeholders about the meaning of some small phrase, which fundamentally skews their interpretation of the data.

When this happens, projects end up getting parked while teams work backwards to find out where data originally came from and so which interpretation of its meaning is correct. It’s a huge waste of time and resource – but it’s not a technology problem, it’s a lack of thought.

All the tooling exists to track the lineage of data. All the tooling exists to support the necessary workflows and approvals of data. What is so often missing is attention to detail in labelling data, or a willingness to think differently and openly about how data from one part of the organisation could be used and interpreted by another – or a third party.
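A minimal sketch of the discipline being described – recording field-level definitions and lineage at ingestion time, so a later dispute over a label can be settled by tracing it back – might look like this in Python. The dataset and field names are invented; this is not any particular catalogue product.

```python
from dataclasses import dataclass, field

@dataclass
class FieldDef:
    name: str
    meaning: str   # the agreed business definition
    unit: str      # an explicit unit avoids silent misreads

@dataclass
class DatasetEntry:
    dataset: str
    upstream: list[str]                                  # lineage: where the data came from
    fields: list[FieldDef] = field(default_factory=list)

catalogue = [
    DatasetEntry(
        dataset="daily_demand",                          # hypothetical dataset
        upstream=["settlement_feed", "weather_station_7"],
        fields=[FieldDef("peak_load", "Highest half-hourly demand in the day", "MW")],
    ),
]

# Resolving a disputed term starts from the catalogue, not from working
# backwards through the data itself.
entry = catalogue[0]
print(entry.dataset, "<-", ", ".join(entry.upstream))
```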