I had the following interesting discussion regarding analytics trends with Raiford Smith, VP of Energy Technology and Analytics at Entergy. Raiford brings more than 26 years of industry experience with strategies and innovations related to supply- and demand-side analytics, and has partnered with other utilities and OEMs to create new technology, develop new markets, and lead cross-functional, international initiatives in Europe, Korea, China, and Japan.
Peter Manos: As a member of Utility Analytics Institute (UAI), you are familiar with their Analytics Value Curve. In what ways do you think these stages and labels clarify matters, and in what ways do they oversimplify?
Raiford Smith: I think any labels and any models we employ are going to do both things—clarify at times and oversimplify at times. It all depends on the context. But the UAI Analytics Value Curve is relatively sophisticated in comparison to other examples I’ve seen recently.
Peter Manos: What makes the Utility Analytics Institute chart different?
Raiford Smith: At a recent workshop we were assessing our level of analytics sophistication, and the survey we were using for self-assessment had “Business Intelligence” as the lowest possible sophistication stage. To me, that was a bit comical. Depending on which department you are talking about at a typical utility, you could easily be several phases behind or ahead of the “Business Intelligence” phase. That’s not to say utilities don’t have analytics capabilities – it just typically hasn’t been a uniform journey across the enterprise.
But while the UAI Analytics Value Curve helps clarify things, I think your suggestion is still correct about ways these labels can oversimplify the journey. For one thing, there are a ton of inter-dependencies, which complicates this sort of “evolution” model.
Peter Manos: Shouldn’t this be a 3D rather than 2D chart, with layers to capture, for example, how you cannot do sophisticated “behind the meter” Customer Engagement if you do not have grid data analytics in place?
Raiford Smith: Yes, but data availability is only one kind of inter-dependency. Data governance – another key inter-dependency – determines just how good your data really is. Getting access to data (like AMI) is just the starting point. Having quality data is even better. Data access and data governance are both part of a multi-dimensional approach to analytics that encompasses an enterprise perspective, not just the needs of one particular silo.
Peter Manos: But I’ve heard experts claim the best way to proceed is to pick one area, make a business case, and then apply analytics tools to solve problems in that one area. And I’ve also heard experts say that each group only wants to hear about what other folks in the same area are doing – that Customer Analytics should be kept separate from Supply Chain Analytics or Grid Optimization Analytics, rather than cross-pollinating analytics successes across these areas. Do you think these ideas are right or wrong?
Raiford Smith: The suggestion that a utility “pick one area to start your analytics journey” might fit when looking at a set of problems for a particular silo of our business. But it won’t fit when thinking about the needs of the enterprise. Our company (and our data) is more than just one silo. It is an interconnected organism where data from one area very much impacts the next, and all are interwoven together in a complex tapestry that defies a silo-based approach. To maximize the value of analytics, you need to start by developing a platform that will be flexible (and cost-effective) for the widest variety of use cases, both today and tomorrow. We don’t know all the things analytics can do – why start by designing it to only tackle one silo’s worth of problems?
Peter Manos: So it sounds like serious mistakes could be made by not thinking holistically from the start – and it also sounds like what the folks in the Geospatial Information System (GIS) world mean when they call for “one version of the truth.”
Raiford Smith: I certainly think it is riskier (and costlier) to take a nearsighted, silo-by-silo approach than to design the foundational analytics capabilities of the company around an enterprise-wide approach. And I think the GIS folks are on to something – we all need to have confidence in the data to make sure our insights aren’t garbage. But we also need to find a way to de-risk our implementation so we’re not puttering around, trying to re-invent the analytics equivalent of the wheel.
One way to do that is to recognize there are a handful of common analytical models to utilize for any particular situation we face.
In other words, regardless of which problem you’re trying to solve, most data scientists will typically start from maybe a dozen common model types, such as logistic regression, random forests, or k-means clustering. So, if you want to get to the truth faster, realize you may already have some models built that you can quickly re-purpose. This should help de-risk your implementation and lower costs.
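As a rough illustration of that re-use, here is a minimal Python sketch using scikit-learn; the registry, parameter settings, and use-case labels are hypothetical examples, not a description of any utility’s actual models.

```python
# Hypothetical "common model" registry: the same handful of model
# types serves very different utility use cases, so each template is
# maintained once and re-purposed rather than rebuilt per silo.
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LogisticRegression

COMMON_MODELS = {
    # classification, e.g. predicting customer churn or transformer failure
    "logistic_regression": lambda: LogisticRegression(max_iter=1000),
    # regression, e.g. load forecasting or outage-duration estimates
    "random_forest": lambda: RandomForestRegressor(n_estimators=200),
    # clustering, e.g. segmenting customers by AMI load profiles
    "kmeans": lambda: KMeans(n_clusters=8, n_init="auto"),
}

def build_model(kind: str):
    """Return a fresh instance of one of the shared model templates."""
    return COMMON_MODELS[kind]()

# The same catalog serves two different silos:
churn_model = build_model("logistic_regression")  # customer analytics
load_segments = build_model("kmeans")             # grid/AMI analytics
```

The design point is the shared catalog: each new use case starts from a proven template rather than from scratch, which is the de-risking described above.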
Peter Manos: In this regard, I’ve heard some pundits say utilities need to imitate other industries, like Amazon’s retail business, and that utilities need to compete more with each other. But I wonder if the opposite is true: whether utilities need to leverage the unique strengths that make them good at being utilities, and even start to cooperate more with other utilities rather than compete with them. So do you think utilities should compete more with each other, or cooperate more?
Raiford Smith: Cooperating, and cross-pollinating ideas and data-related insights across utilities, is absolutely important in helping our industry move the state of the art forward. It also helps each of us optimize the models we use in our analytics work.
Peter Manos: I find myself wondering about the need to apply analytics tools to our strategies themselves. If you look at our industry generally, based on the number of sensors and the volume of real-time data being utilized now, and look at where we will be in the future, a huge increase is on the way. When you look at where we are going to end up, with more on-site generation and Demand Response, and millions of electric vehicles, we are talking about billions of real-time data points at end-use customers. Compared to the roughly 10,000 generating facilities being synchronized in their regional control areas now, that is a 10,000- to 100,000-fold multiplier of real-time data usage, not just a doubling or tripling.
How will we manage 20- to 40-year T&D assets with software, communications, and control systems that will have much shorter, 2- to 4-year life cycles? How will we standardize these systems, from a systems engineering and reliability point of view, so that our engineers can “do” engineering and not get bogged down in hundreds of different communications and control protocols?
Raiford Smith: It involves a change in the overall framework for this sort of work.
Thus, in order to maximize your investment, I would recommend creating an enterprise analytics roadmap that includes three key capabilities regardless of use case or silo: data governance (to ensure the data and insights are high-quality and are defined, managed, and tracked), data storage (to collect structured and unstructured data for curation, reporting, and analytics in a cloud-based environment), and standards-based analytics tools (to turn the data into information and insights).
From an enterprise perspective, you can gain a more cost-effective, speedier ramp to analytics maturity if you avoid buying use-case-specific tools to solve specific problems. After you have the right platform, you have the right ingredients to prosecute use cases for any silo of the business. Focus on one area? No problem. Shift to the latest C-suite priority? No problem. Provide better insights and help transform an organization or the whole company? No problem. The order and priority of use cases is entirely up to you, as long as you build the right, flexible, standards-based foundation. Utilities should stop looking at their work (and data) as if it resides in a silo. It doesn’t.
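As a rough sketch of how the first two capabilities might pair in practice – a minimal, hypothetical Python illustration; the dataset name, quality rules, and steward are invented for the example, not drawn from Entergy’s platform:

```python
# Hypothetical sketch: governance rules travel with a shared data
# store, so every silo inherits the same definition of "good" data.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class GovernedDataset:
    """A dataset plus the quality rules that define acceptable data."""
    name: str
    owner: str  # the accountable data steward
    rows: list[dict] = field(default_factory=list)
    quality_rules: list[Callable[[dict], bool]] = field(default_factory=list)

    def load(self, incoming: list[dict]) -> int:
        """Accept only rows that pass every governance rule."""
        accepted = [r for r in incoming
                    if all(rule(r) for rule in self.quality_rules)]
        self.rows.extend(accepted)
        return len(accepted)

# One governed store of AMI interval reads, usable by any silo:
ami = GovernedDataset(
    name="ami_interval_reads",
    owner="data_governance_office",
    quality_rules=[
        lambda r: r.get("kwh") is not None and r["kwh"] >= 0,  # no negative usage
        lambda r: "meter_id" in r,  # every read traceable to a meter
    ],
)
ami.load([{"meter_id": "m1", "kwh": 1.2}, {"kwh": -5.0}])  # second row rejected
```

Because the rules live with the dataset rather than with any one use case, customer analytics and grid analytics see the same curated data.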
Peter Manos: Wow – that is one big enchilada! I think about the saying, often attributed to Nelson Mandela, regarding big changes: “It always seems impossible, until it is done.”
Raiford Smith: (Laughs.) Yes. And you are right – it is a 10,000- to 100,000-fold increase, not a 2x, 3x, or 10x one. And we need to have a lot of other things in place, from a business, technology, and regulatory point of view.
Peter Manos: I agree with your idea about the whole utility enterprise needing to approach data and analytics like one big organism. But how are we going to interact with that organism? I think about when Steve Jobs built the iPod – way back when, not the iPad but the iPod – and all the unexpected benefits that came when Apple set a goal of simplifying the human interaction with the device. They made that circular single-touch control interface with the dot in the middle of the circle, and in unexpected ways it created a revolution. How can we create a revolution?
Raiford Smith: At the end of the day, we need a robust, interoperable, digital grid. From it, many things are possible – data, advanced optimization and coordination, and new capabilities. But to your point, it is about starting with the right analytics approach, and working your way up through the organism. Think about the example of AMI and how it got started — it was implemented to lower the cost of meter reads, provide remote disconnect/reconnect, and reduce “non-technical losses,” but ended up bringing us digital automation as, in my view, its greater value.
Similarly, our analytics journey will bring us greater value than we expect, and we will get there in unexpected ways as well, even if they seem impossible. We are building a beautiful cake, and I cannot tell you if it will be vanilla or chocolate or red velvet. But I can tell you it will be delicious!
For further insights from Raiford Smith and other industry thought leaders, don’t miss out on the Knowledge Executive Summit and Utility Analytics Week, both taking place in San Antonio this Oct. 31-Nov. 3.