How are utility regulatory models evolving, and what role will analytics have to play in this new world?
The advanced analytics tools being adopted by leading utilities are not only facilitating the uptake of new technologies such as IoT devices; they are also becoming an increasingly essential element of market and regulatory mechanisms.
Evolution of Markets and DER
As energy resources become increasingly distributed, data and analytics will be at the core of forecasting and valuing power and load-shifting opportunities in real time. Advanced analytics tools will also play a central role in facilitating the real-time market signals required for distributed energy resources (DER) and in enabling new transactional relationships with customers and third parties.
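As a simple illustration of the kind of calculation these tools automate, the sketch below values a hypothetical day-ahead load-shifting opportunity for a single DER against a forecast of hourly prices. The price curve, the shiftable energy, and the round-trip efficiency are illustrative assumptions, not data from any particular market.

```python
# Illustrative sketch: valuing a day-ahead load-shifting opportunity for a DER.
# All inputs (prices, shiftable energy, efficiency) are hypothetical.

hourly_prices = [32, 28, 25, 24, 26, 35, 48, 62, 55, 45, 40, 38,
                 37, 36, 39, 47, 66, 95, 88, 70, 55, 44, 38, 34]  # $/MWh

shiftable_mwh = 5.0           # energy the DER can move between hours
round_trip_efficiency = 0.88  # losses incurred when storing and re-dispatching

# Buy (charge) in the cheapest hour, sell (discharge) in the most expensive hour.
charge_hour = min(range(24), key=lambda h: hourly_prices[h])
discharge_hour = max(range(24), key=lambda h: hourly_prices[h])

cost_to_charge = shiftable_mwh * hourly_prices[charge_hour]
revenue_from_discharge = shiftable_mwh * round_trip_efficiency * hourly_prices[discharge_hour]

value = revenue_from_discharge - cost_to_charge
print(f"Charge at hour {charge_hour} (${hourly_prices[charge_hour]}/MWh), "
      f"discharge at hour {discharge_hour} (${hourly_prices[discharge_hour]}/MWh): "
      f"estimated value ${value:,.0f}")
```

In practice the same logic would run against a continuously updated price forecast and a portfolio of DERs rather than a single asset and a fixed daily curve.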
The changes regulators must make to support this new ecosystem involve more than program-by-program fine-tuning of the way ratemaking methods incentivize utilities. Key assumptions underlying traditional cost-of-service regulation no longer apply; the old model assumed a vertically integrated utility with steadily growing load and stable market conditions and technologies.
The New York State Public Service Commission (NY PSC) put it well when it said: “Achieving the most productive mix of utility and third-party capital will require utilities to forego – and, crucially, to plan to forego – some level of capital investment on which they would ordinarily earn a return. Even if capital and operating expenses are treated identically, utilities will still be required to plan for a substantial level of third-party involvement in the system and, correspondingly, a reduced utility share of total expenditures.” (Matter 14-00581, page 35.)
This implies that the domain for utility planners will expand to include the surrounding ecosystem of third-party players and new customer relationships. That larger domain will increase both the complexity and the potential value addressed (and captured) by the advanced analytics tools planners apply to it.
No More Business as Usual
Fundamentally, advanced analytics also has a pivotal role to play in managing the ever-increasing operational, customer, and economic complexities associated with optimizing utilities’ performance.
With more real-time operational and customer behavior data, and better visibility into the associated O&M costs, than ever before, utilities are in a better position to allocate capital optimally and to more rapidly justify repair/replace decisions for T&D equipment that will increase reliability and/or quality of power delivery.
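As a minimal sketch of the decision logic this data enables, the comparison below weighs the expected annual cost of keeping an aging asset in service against the annualized cost of replacing it. The failure probability, costs, and discount rate are hypothetical placeholders for values that would normally come from asset-health models and accounting systems.

```python
# Illustrative repair-vs-replace comparison for a single T&D asset.
# Failure probability, costs, and the discount rate are hypothetical inputs.

annual_failure_prob = 0.12    # condition-based estimate from sensor data
cost_per_failure = 250_000    # repair cost plus outage / reliability penalty
routine_maintenance = 15_000  # yearly O&M if the asset is kept in service

replacement_capex = 900_000
replacement_life_years = 40
discount_rate = 0.07

# Expected annual cost of keeping the existing asset in service.
keep_cost = annual_failure_prob * cost_per_failure + routine_maintenance

# Annualized cost of replacement, via the capital recovery factor.
crf = (discount_rate * (1 + discount_rate) ** replacement_life_years) / \
      ((1 + discount_rate) ** replacement_life_years - 1)
replace_cost = replacement_capex * crf

decision = "replace" if replace_cost < keep_cost else "keep repairing"
print(f"Keep: ${keep_cost:,.0f}/yr  Replace: ${replace_cost:,.0f}/yr  -> {decision}")
```

The real models layer reliability targets, network effects, and portfolio-level budget constraints on top of this single-asset arithmetic, but the break-even comparison is the core of the justification.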
In all cases, a data-driven look back (benchmarking) and look forward (forecasting) form the basis for the rewards and penalties doled out during periodic rate cases, whether those ratemaking processes are based on traditional cost-of-service models or on newer performance-based regulation (PBR) approaches.
Historically, both benchmarking and forecasting models have needed to capture sensitivities to changes in the cost of capital driven by interest-rate and related market fluctuations; to changes in fuel costs and the price of purchased power; and to changes in electricity usage driven by factors ranging from housing starts to the weather sensitivity of average utility customers and consumer price index (CPI) trends.
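To make that concrete, the sketch below sweeps a toy revenue-requirement forecast across hypothetical low/base/high assumptions for the cost of capital, fuel prices, and load growth. The structure and every number in it are illustrative, not any utility’s actual model.

```python
from itertools import product

# Illustrative sensitivity sweep for a toy revenue-requirement forecast.
# Base quantities and the low/base/high assumptions are hypothetical.

rate_base = 2_000_000_000         # $ of invested capital earning a return
fuel_and_purchased_mwh = 9_000_000
base_load_mwh = 10_000_000
non_fuel_om = 300_000_000         # other O&M, held constant in this sketch

cost_of_capital = {"low": 0.06, "base": 0.07, "high": 0.08}
fuel_price = {"low": 28, "base": 35, "high": 45}          # $/MWh
load_growth = {"low": -0.01, "base": 0.00, "high": 0.02}  # annual change

for wacc, fuel, growth in product(cost_of_capital, fuel_price, load_growth):
    sales_mwh = base_load_mwh * (1 + load_growth[growth])
    revenue_requirement = (rate_base * cost_of_capital[wacc]
                           + fuel_and_purchased_mwh * fuel_price[fuel]
                           + non_fuel_om)
    avg_rate = revenue_requirement / sales_mwh
    print(f"WACC={wacc:4s} fuel={fuel:4s} load={growth:4s} -> "
          f"req ${revenue_requirement/1e6:,.0f}M, avg rate ${avg_rate:.1f}/MWh")
```

Even this toy sweep produces twenty-seven scenarios; real rate-case models track many more drivers, which is precisely why the tooling matters.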
Consider fully justifying just one such technological investment, for example a new investment in asset health. Asset health investments span new sensors, modeling tools, predictive engines, repair/replace models, and cost-of-service impact models, all intended to better leverage real-time performance data to fine-tune related O&M activities and extend asset life across a wider portfolio of assets.
For such an investment to be justified, there must be a model of how the whole T&D and generating system would have performed, in cost, service quality, and repair/replace outcomes, across one or more “with versus without” scenarios, that is, with or without one or more sets of advanced analytics investments.
As this example shows, a great deal is asked of these models. They must compare not only actual versus expected investment costs, but also expected versus actual O&M costs and benefits, and what those costs and benefits would have been in the absence of the investment under consideration.
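A stripped-down version of such a “with versus without” comparison might look like the following sketch, which nets the costs of the two scenarios against the analytics investment and compares the result to the original business case. Every figure is a hypothetical placeholder for outputs of the cost, reliability, and repair/replace models described above.

```python
# Illustrative "with versus without" comparison for an asset-health
# analytics investment. All figures are hypothetical placeholders.

scenarios = {
    "without_analytics": {"om_cost": 120_000_000, "outage_cost": 30_000_000,
                          "capex": 0},
    "with_analytics":    {"om_cost": 105_000_000, "outage_cost": 22_000_000,
                          "capex": 12_000_000},  # sensors, software, integration
}

expected_net_benefit = 18_000_000  # what the original business case promised

def total_cost(s):
    return s["om_cost"] + s["outage_cost"] + s["capex"]

# Actual benefit is the counterfactual cost minus the realized cost.
actual_net_benefit = total_cost(scenarios["without_analytics"]) - \
                     total_cost(scenarios["with_analytics"])

print(f"Actual net benefit:   ${actual_net_benefit/1e6:,.1f}M")
print(f"Expected net benefit: ${expected_net_benefit/1e6:,.1f}M")
print(f"Variance vs. plan:    ${(actual_net_benefit - expected_net_benefit)/1e6:,.1f}M")
```

The hard part, of course, is producing a credible counterfactual “without” scenario, which is exactly what the benchmarking and forecasting models are asked to defend in a rate case.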
These forward-looking justifications will also reappear in the rear-view mirror, when they are invoked to earn a regulatory reward for recent good performance after such an investment, or when they figure in a cost disallowance (a regulatory slap on the wrist) after poor performance.
Counteracting Challenges and Complexity
Utilities cannot afford to underestimate the magnitude of the increase in complexity associated with the technologies being deployed to leverage the value of DERs and other customer-based resources and to improve grid reliability and resiliency. The associated constraints are also on the rise: utility personnel must deploy increasingly sophisticated advanced analytics tools to justify forecasts, what-if scenario modeling, and the associated budgeting processes as part of their market-building and rate-case work.
All of this points to a pressing need not only for the best possible advanced analytics tools, but also for the broadest, most accurate, and most up-to-date data sets on which to apply them. Utilities also need standards that enable maximal sharing of data while respecting privacy concerns. The growing strength of analytics communities among utilities is a testament to the value being derived from shared data and best practices amid the dynamic changes in the industry.