Operational Analytics: Avoid “Get Rich Quick” Schemes

February 19, 2019

Defined in the negative, operational analytics are not about those precious “aha” moments or dramatic storytelling; they are about making better decisions in the moment.

While the popular press generates many stories about the beneficial returns from investments in customer-satisfaction analytics, the ability to consistently tune the grid with the small adjustments afforded by operational analytics represents a significant opportunity for utilities.

Because of marketing buzz (and the need to justify expenditures), it is often assumed that only strategic analytics can bring the as-advertised dramatic economic impacts to the utility. But operational analytics, which support a wide range of low- to high-value decisions, can in aggregate produce high-impact results to the bottom line.

Operational analytics are built on massive amounts of data and deliver very low-latency responses that, in a growing number of deployments, stream along without any human intervention. However, the recent hard push to move intelligence to the grid edge with straight-through and onboard processing analytics remains a hotly debated issue within the grid operations community.

This is appropriate, because a grid edge intelligence strategy overturns the existing philosophy of operations for the utility, which is still largely represented by a central operations center model. Still, as these investment priorities get sorted out, the conversation doesn’t need to obscure the benefits of more traditional operational analytics that leverage the use of data mining, predictive analytics, optimization, and simulation techniques.

The Center Still Holds.

Providing insight into the grid from the substation to the customer can be a boon for utilities. Sensor applications in this part of the grid improve the prediction and resolution of critical conditions and can provide improved reliability and security in a part of the network that has long been dark to operators. Depending on the application, utilities that ramp up sensor-to-analytics programs on the secondary find that instantaneous reporting to system operators from sensors installed at substations, on transformers, and from the sensing abilities within smart meters themselves makes for a winning integration effort.

The long-sought “transformative shifts” for the electric system in the U.S. are still gradual in many jurisdictions, but utilities now have the opportunity to deploy instrumentation to perform tasks on the grid that were simply not possible (or too difficult) just a decade ago. For example, by aggregating performance characteristics and information about load, operators can understand system load and utilization to ensure that assets remain within their operational parameters over time. Indeed, the business case for asset management will likely remain the most persuasive among both public and private operations.
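To make the asset-management idea concrete, the sketch below aggregates interval load readings per asset and flags transformers whose peak utilization exceeds a fraction of nameplate rating. It is a minimal illustration only; the asset names, ratings, and threshold are hypothetical, not drawn from any particular utility deployment.

```python
# Minimal sketch: flag assets whose peak load exceeds a utilization
# threshold against nameplate rating. All names and values are invented.
from collections import defaultdict

def flag_overloaded(readings, ratings_kva, threshold=0.9):
    """readings: list of (asset_id, load_kva) interval samples.
    ratings_kva: nameplate rating (kVA) per asset.
    Returns {asset_id: peak_utilization} for assets over threshold."""
    peaks = defaultdict(float)
    for asset_id, load_kva in readings:
        peaks[asset_id] = max(peaks[asset_id], load_kva)
    return {a: round(p / ratings_kva[a], 3)
            for a, p in peaks.items()
            if p / ratings_kva[a] > threshold}

samples = [("XFMR-1", 45.0), ("XFMR-1", 52.0), ("XFMR-2", 30.0)]
ratings = {"XFMR-1": 50.0, "XFMR-2": 50.0}
print(flag_overloaded(samples, ratings))  # XFMR-1 peaks at 52/50 = 1.04
```

In a real deployment the readings would arrive as a low-latency stream from substation and meter sensors, but the aggregation step is the same.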

Opportunity is neither here nor there.

The management of intermittent renewable generation sources is another powerful case that can provide a significant impact on reliability and coordination that goes far beyond old-school supervisory control and data acquisition (SCADA) capabilities. A program of comprehensive analytics is not optional when a utility is serious about bringing distributed generation resources such as energy storage, plug-in electric vehicles (PEVs), rooftop solar feed-in, and demand-response programs into the supply mix.

Unfortunately, utilities that desire to move in this direction still stumble over the basic tasks of description, classification, and clustering of data for problem diagnosis and prediction. This is due, in large part, to a lack of experience in understanding the value of operational analytics, how to invest, how to establish return on investment (ROI), and how to work operational analytics into business strategy and planning.
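As one concrete illustration of the clustering task mentioned above, the sketch below assigns daily load profiles to their nearest centroid, the core step of a k-means-style grouping often used to classify feeders by usage pattern. The feeder names, profiles, and centroids are invented for illustration.

```python
# Minimal sketch: group daily load profiles by nearest centroid
# (one assignment step of a k-means-style clustering). Data are invented.
import math

def nearest_centroid(profile, centroids):
    """Return the index of the centroid closest to the profile
    by Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(range(len(centroids)), key=lambda i: dist(profile, centroids[i]))

# Hypothetical four-point daily profiles (kW): flat vs. evening-peaking
centroids = [(10, 10, 10, 10), (5, 5, 20, 30)]
feeders = {
    "FDR-A": (9, 11, 10, 10),   # roughly flat all day
    "FDR-B": (4, 6, 18, 28),    # evening peak
}
labels = {name: nearest_centroid(p, centroids) for name, p in feeders.items()}
print(labels)  # FDR-A joins the flat cluster, FDR-B the evening-peak cluster
```

Even this stripped-down form shows why the basics matter: without clean, well-described interval data, the distance computation (and any diagnosis built on it) is meaningless.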

Yes, there is a lack of industrywide consensus about the appropriate philosophy for operational analytics, even as the technology required to support core processes is well established and proven. As always, utility organizations are well advised to start small and grow their efforts incrementally (even if quickly), investing not just in a big-bang technology but in managed rollouts that prevent chaotic organizational shifts.

There are over 3,000 utilities in the United States with highly variable deployments, but a massive technology failure on the grid comes at a high cost for any utility and its customers. High-quality, resilient analytics for operations should be valued in the aggregate: developing powerful analytical models focused on efficiency, cost reductions, and the careful management of challenges as grid intelligence becomes more distributed may be the fastest route to maximum productivity and profitability.


Carol L. Stimmel is the Principal Researcher at Interswarm and the author of several standard texts, including Big Data Analytics Strategies for the Smart Grid and Designing Smart Cities.