In 2017, “hosting capacity analysis” was the buzz phrase heard in utility, developer and regulator circles. The Interstate Renewable Energy Council defines hosting capacity as “the amount of DERs that can be accommodated on the distribution system at a given time and at a given location under existing grid conditions and operations.” At the time, advocates for distributed solar photovoltaic (PV) systems were pressing regulators for efficient and transparent interconnection processes.
Flash forward three years. Utility- and customer-scale distributed energy resources (DERs) have increased. Energy storage deployments are ramping up. Electric vehicle (EV) public charging providers are building out networks at a time when charging patterns are not well understood. At the same time, regulators in some states are requiring utilities to produce annual forecasts and hourly load profiles. In these extraordinarily complex and dynamic environments, analytics can help.
The typical interconnection process works like this: a developer applies to the utility or ISO for an interconnection permit. The utility/ISO responds with its approval along with the cost of interconnection. In addition to fees, there may be costs associated with upgrading the network. For example, upgrades to substations and/or feeders may be needed to support fast-charging depots. Of course, the process really starts before that. To encourage the installation of DERs in places that will benefit the grid, regulators require that utilities make locational information available to the public.
Tracking transactions is easy…advanced analytics, not so much
It has been relatively easy to assemble the technology that supports the transactional processes related to interconnection. Visualization displays help developers and utilities plan for future installations. Displays show where DERs could provide benefits to the grid. For example, New York State has a portal displaying capacity maps in ArcGIS.
Tracking applications in the interconnection process is fairly straightforward as well. Generic workflow tools will work, although utility-specific applications are available from companies such as Clean Power Research and ANB Systems.
The hard part is assembling the technology to support two other segments of the process: the hosting capacity analysis itself and the evaluation of interconnection applications. Short- and long-term forecasting capabilities are involved, as are power flow calculations. That is where more sophisticated analytics come into play. GridUnity is an example of a platform designed to make this process more robust and streamlined.
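To make the idea of a hosting capacity screen concrete, here is a minimal sketch, not GridUnity’s or any utility’s actual method: inject progressively larger PV output at a point on a radial feeder and stop at the first voltage violation. The simplified voltage-rise approximation and all feeder parameters (resistance, reactance, voltage limit) are illustrative assumptions.

```python
# Toy hosting-capacity screen: increase PV injection at the end of a
# radial feeder until the approximate voltage rise exceeds the limit.
# All parameters (R, X, limits) are illustrative assumptions, not data
# from any utility or platform.

def voltage_rise_pu(p_mw: float, q_mvar: float, r_ohm: float,
                    x_ohm: float, v_kv: float) -> float:
    """Approximate per-unit voltage rise: dV ~ (P*R + Q*X) / V^2."""
    return (p_mw * r_ohm + q_mvar * x_ohm) / (v_kv ** 2)

def hosting_capacity_mw(r_ohm: float = 2.0, x_ohm: float = 1.5,
                        v_kv: float = 12.47, v_limit_pu: float = 0.05,
                        step_mw: float = 0.1) -> float:
    """Largest PV injection (MW) that keeps voltage rise within the limit."""
    p = 0.0
    while voltage_rise_pu(p + step_mw, 0.0, r_ohm, x_ohm, v_kv) <= v_limit_pu:
        p += step_mw
    return round(p, 2)

if __name__ == "__main__":
    print(f"Screened hosting capacity: {hosting_capacity_mw()} MW")
```

A production analysis would replace the closed-form approximation with full power flow runs across time-series load and generation profiles, but the search structure, sweep the injection and detect the first constraint violation, is the same.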
MISO recently rolled out GridUnity’s cloud-based platform to support its generation interconnection process. For MISO, the evaluation of interconnection applications has been labor-intensive. That process was sufficient when the number of applications was limited. A recent visit to MISO’s generator interconnection queue active project map showed 415 projects in the queue. Seventy-three percent are renewables or storage projects.
Both transmission and distribution hosting capacity analysis can be done on the platform. PG&E is using GridUnity to support hosting capacity analysis, using “automated analysis and management-by-exception workflows to ensure completeness of the model data and resolve power flow issues.” The ability to bring in and automate both transmission and distribution models allows utilities to do hybrid modeling.
From the start, data inputs need to be complete and trusted. The GridUnity platform has an automated process for calibrating and validating data that uses machine learning. That is especially important when there are thousands of feeders to review.
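As a hedged illustration of what automated data screening can look like, not GridUnity’s actual ML pipeline, the sketch below flags feeders whose reported peak load is a statistical outlier relative to the rest of the fleet. The feeder IDs, values, and threshold are all invented for the example.

```python
# Minimal sketch of automated data screening: flag feeders whose
# reported peak load is a statistical outlier relative to the fleet.
# This simple z-score check stands in for the ML-based calibration and
# validation described above; all data and thresholds are illustrative.
from statistics import mean, stdev

def flag_outlier_feeders(peaks_mw: dict, z_max: float = 3.0) -> list:
    """Return feeder IDs whose peak load lies more than z_max
    standard deviations from the fleet mean."""
    mu = mean(peaks_mw.values())
    sigma = stdev(peaks_mw.values())
    return [fid for fid, p in peaks_mw.items()
            if sigma > 0 and abs(p - mu) / sigma > z_max]

if __name__ == "__main__":
    peaks = {"FDR-001": 4.2, "FDR-002": 3.9, "FDR-003": 4.1,
             "FDR-004": 41.0,  # plausibly a decimal-point entry error
             "FDR-005": 4.0}
    print(flag_outlier_feeders(peaks, z_max=1.5))
```

With thousands of feeders, exception lists like this let engineers review only the records most likely to be wrong rather than inspecting every model by hand.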
The GridUnity platform is “modeling system agnostic.” This enables utilities to standardize, automate and scale their analysis using both commercial and open-source power flow solvers. For years, utilities have relied on engineering and economic models to do grid planning. Transmission and distribution power flow solvers (PSSE, PSLF, CYME, Synergi) are well accepted in the industry by utilities and regulators alike. The GridUnity platform gives engineers access to the verified and validated models the utility is already using in-house, which promotes familiarity and consistency. The user also can bring in currently available open-source algorithms, such as models used to determine optimal power flow and run large-scale power simulations. Even more important, the computational scale of the cloud platform allows thousands of simulations to be executed.
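The scale argument can be sketched with a simple fan-out pattern: define a scenario, run it many times concurrently, and collect results. The `solve_scenario` function below is a stand-in for a call to a real commercial or open-source solver, and its linear loading model is an invented placeholder; only the parallel-sweep structure is the point.

```python
# Illustrative sketch of scaling scenario analysis: run many "power
# flow" scenarios concurrently and keep the largest passing PV size.
# solve_scenario is a placeholder for an external solver call; the
# loading model is a toy assumption, not real feeder behavior.
from concurrent.futures import ThreadPoolExecutor

def solve_scenario(pv_mw: float) -> tuple:
    """Placeholder 'solver': pass if feeder loading stays at or under 100%."""
    loading_pct = 60.0 + 8.0 * pv_mw  # toy linearized loading model
    return pv_mw, loading_pct <= 100.0

def sweep(max_mw: float = 10.0, steps: int = 1000) -> float:
    """Evaluate `steps` PV sizes in parallel; return the largest that passes."""
    sizes = [max_mw * i / steps for i in range(1, steps + 1)]
    # Real runs would shell out to an external solver process, so
    # threads can overlap the waiting on each scenario.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(solve_scenario, sizes))
    return max((mw for mw, ok in results if ok), default=0.0)

if __name__ == "__main__":
    print(f"Largest passing PV size: {sweep():.2f} MW")
```

On a cloud platform the same pattern distributes scenarios across many machines instead of threads, which is what makes thousands of simulations per study tractable.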
This approach is not the only option, however. Southern California Edison opted to build its own hosting capacity analysis capabilities by “integrating disparate vendor products, customizing and configuring each product, and operationalizing them within SCE’s production environment”, according to SCE’s 2021 General Rate Case for Grid Modernization, Grid Technology and Energy Storage. SCE is also looking for these tools to deliver annual hour-based load and DER forecasting plus risk-based distribution project portfolio management. The build-out is ongoing, and more detail can be found in SCE Develops Systems Planning tool for Grid Modernization. If you want to see how much the build-it approach could cost, check out the rate case; it provides a lot of detail.