Market view: Define your data destiny

There’s more to getting the most out of data stored in multiple silos than simply deploying the right technology – you have to care too, says Franz Winterauer

While buzz phrases such as big data, analytics and real-time insight have become part of our business vernacular over the past decade, the thirst to get more value out of data is nothing new.

If you look back to the days of traditional analytics, you will be reminded that, 30 years ago, business users sought to extract value from data just as much as they do today. The way the practice has evolved, however, is having a significant, detrimental impact on the value of analytics today, and potentially on its value in the future.

Technologies to provide more information for decision-making were adopted ad hoc to meet specific business and operational requirements – and to satisfy the appetite of data-savvy individuals in different departments. This has resulted in countless data silos with hundreds of applications sitting on top of them. These applications were acquired in “stealth mode” to deliver value in a speedy and flexible way – at least, that was the promise. As a result, utilities’ data stores have evolved into multi-application setups supplied by multiple vendors.

Specific gains can be generated at the application level, but full business value can be harvested only if the data is of the required quality and consistency. Utilities with data in silos run the risk of implementing good-looking systems with fancy, blinking dashboards, but without gaining the quality of insight they need. The problem is about to get worse, as cloud delivery makes it even easier and cheaper to tap into the “superficial” benefits of even more siloed applications.
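To make the consistency risk concrete, here is a minimal sketch – with entirely hypothetical silo contents, identifiers and units – of how the same feeder load, held in two silos under different names and units, produces a dashboard figure that looks plausible but is quietly wrong:

```python
# Hypothetical example: the same feeder load held in two silos.
# The metering silo records kilowatts; the asset silo records megawatts.
metering_silo = {"feeder_17": {"load": 4200.0, "unit": "kW"}}
asset_silo = {"FDR-0017": {"load": 4.2, "unit": "MW"}}

def naive_dashboard_total(*silos):
    """Sums 'load' across silos with no unit or identity reconciliation."""
    return sum(rec["load"] for silo in silos for rec in silo.values())

print(naive_dashboard_total(metering_silo, asset_silo))  # 4204.2 -- wrong

UNIT_TO_KW = {"kW": 1.0, "MW": 1000.0}
ALIASES = {"FDR-0017": "feeder_17"}  # maps silo-local IDs to a canonical ID

def reconciled_total(*silos):
    """Normalises identities and units before aggregating."""
    loads_kw = {}
    for silo in silos:
        for key, rec in silo.items():
            canonical = ALIASES.get(key, key)
            loads_kw[canonical] = rec["load"] * UNIT_TO_KW[rec["unit"]]
    return sum(loads_kw.values())

print(reconciled_total(metering_silo, asset_silo))  # 4200.0 -- one feeder, in kW
```

The naive total renders perfectly well on a fancy dashboard; only reconciling identities and units first gives a number you can act on.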

Enter a solution to this chaos… the platform. Or maybe not? Platforms can provide enormous efficiency and value, but you cannot simply plug data from multiple applications into a single platform and get a holistic picture across service departments, functions or assets. So although it is enticing, the single-platform approach as a means to secure more value from data – and finally consolidate your architecture – is an illusion.

There are a couple of reasons for this. First, the individual strengths of the different software products chosen by utility firms remain a market reality. As a result, it is unlikely that the asset management, control room, smart metering and finance teams will agree on a single approach. Second, those teams have invested both time and money in getting the most from their tools, making them reluctant to change the way things are done.

Nonetheless, the reality is that analytics can no longer rely on the old-style, one-directional approach, sitting in parallel with the operational systems that feed it data. In short, implementing the newest, shiniest bit of tech on top won’t guarantee the data quality and consistency needed. In a modern architecture, analytics must be embedded, interconnected and quality-controlled. Only by achieving this will you be able to implement central analytics with big data, and subsequently distributed analytics at the edge, making “small data” a reality for tomorrow.
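What “embedded, interconnected and quality-controlled” can mean in practice is easiest to show with a rough sketch. Assuming a simplified stream of readings and illustrative plausibility thresholds – this is not a reference design – the quality gate lives inside the data path itself, so both the central store and the edge model only ever see validated records:

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator

@dataclass
class Reading:
    asset_id: str
    value_kw: float
    quality: str = "unchecked"

def quality_gate(readings: Iterable[Reading]) -> Iterator[Reading]:
    """Embedded check: runs inside the data path, not as an afterthought."""
    for r in readings:
        # Illustrative plausibility bounds; real rules would be asset-specific.
        r.quality = "good" if 0.0 <= r.value_kw < 50_000.0 else "suspect"
        yield r

def pipeline(readings: Iterable[Reading],
             central_sink: Callable[[Reading], None],
             edge_sink: Callable[[Reading], None]) -> None:
    """One validated stream feeds central (big data) and edge (small data)."""
    for r in quality_gate(readings):
        if r.quality == "good":
            central_sink(r)  # e.g. an enterprise data lake
            edge_sink(r)     # e.g. a substation-local model
        # 'suspect' records go to review instead of silently reaching a chart

central, edge = [], []
pipeline([Reading("feeder_17", 4200.0), Reading("feeder_17", -9999.0)],
         central.append, edge.append)
print(len(central), len(edge))  # 1 1 -- the bad reading reached neither
```

The design point is that validation is not a separate, after-the-fact report: a record that fails the gate never reaches an analytic consumer, central or edge.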

So, rather than a single platform, it is possible to extract more value from your data using an architectural approach that is open, multi-directional and embedded. You should still rely on your platform strategy, but be prepared for it to be multi-platform and about more than the application layer. Utilities must also take care of the analytics enablement layer, or AEL.

How can this be achieved? The shortest answer is that you need to care. Caring means accepting that deploying nice apps in the cloud will not void the garbage in, garbage out principle. Caring also means understanding that the architecture, as well as the AEL, will not be “plug and play”. While there might be multiple technology solutions for managing data, data owners must understand there is one enterprise data strategy.
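One way to express “many technology solutions, one enterprise data strategy” is a single canonical contract that every silo-facing adapter must satisfy before its output is accepted. Here is a minimal sketch, with a hypothetical contract and made-up vendor field names:

```python
# Hypothetical enterprise data contract: one schema, many vendor adapters.
REQUIRED_FIELDS = {"asset_id": str, "timestamp": str, "value_kw": float}

def conforms(record: dict) -> bool:
    """True only if the record satisfies the shared enterprise contract."""
    return all(isinstance(record.get(f), t) for f, t in REQUIRED_FIELDS.items())

def adapt_metering(raw: dict) -> dict:
    """Adapter for one vendor's feed (field names are illustrative)."""
    return {
        "asset_id": raw["meter_id"],
        "timestamp": raw["ts"],
        "value_kw": float(raw["kwh_half_hour"]) * 2.0,  # 30-min kWh -> avg kW
    }

raw = {"meter_id": "feeder_17", "ts": "2017-06-01T10:00:00Z",
       "kwh_half_hour": "2100"}
record = adapt_metering(raw)
assert conforms(record)  # non-conforming data is rejected before analytics
```

Each tool keeps its own strengths and its own adapter; the strategy lives in the one schema they all converge on.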

The good news is that there is help out there, and a great deal of the effort can be outsourced. You should never outsource being in the driving seat, though: the way you handle your data depends on the use cases you are pursuing, which should be tightly aligned with your business strategy.

As part of its Grid of the Future project, CPS Energy in Texas has partnered with Omnetric Group to help define and implement its analytics strategy. CPS Energy recognised that the data used to generate information must come from both operational technologies and the IT systems that manage related business processes. It views analytics as a core competence that will allow it to enhance its operations and better understand its customers, in an effort to provide the best possible products and services.

So there are some good examples of best practice cropping up. But what concerns me is that, to date, we have barely scratched the surface when it comes to revealing the potential of grid analytics. Stopping at the app approach or believing in the platform promise will hinder the industry’s progress.

Given the multi-directional nature of the evolving electricity system, the utilities industry has a greater need and desire than ever before to understand what is happening at any point on the grid, at any given time.

For sure, continue to invest in an intricate, diverse ecosystem of technologies worthy of the rich data you hold. But also rely on a team of people who can navigate the new era of analytics, and realise that you have to care if you want data-driven insight that has a material impact on your long-term business performance.