Diversification of Power Generation Brings Greater Need for Data-Based Decisions
Utilities and other companies in the electric power sector are using data to solve the dilemma of decarbonizing the power grid while also diversifying their generation resources.
Utilities and power generators know that collecting data is critical to their operations. The process has grown in importance as the power grid diversifies, with more renewable energy being added to existing thermal generation.
Software and other tools are providing the electric power industry with more data from a variety of sources, supporting the ability of energy producers to make better decisions when it comes to their assets. Capturing real-time performance data promotes reliability and resiliency of the power supply, and helps optimize operations to better serve customers.
“As the power sector decarbonizes, the mix of generation types will change in obvious and non-obvious ways,” said Allan Schurr, chief commercial officer of Houston, Texas-based Enchanted Rock, an energy company that serves several industries and is known for its innovative microgrid designs (Figure 1). “The obvious changes are increased wind and solar energy, along with storage to time-shift these variable resources to meet load. The non-obvious changes are the shift in conventional resources’ operation from baseload to low-capacity-factor, dispatchable capacity that balances supply and demand in infrequent tail events—such as heat waves and polar vortices—when load is extreme and renewable output is limited.”
Schurr told POWER, “In these periods, the performance of conventional resources must be assured, and data analytics will be needed to ensure performance when it’s needed. For example, in extreme heat, derating of transmission lines due to high temperatures and loading could create a need for more local dispatchable capacity. Likewise, extreme cold can be difficult to forecast given growing electric heating, so forecast error bands need to be widened and appropriate contingencies addressed. In both of these instances, data analytics improves asset performance and system operations through real-time sensing, modeling, and decision support.”
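The derating Schurr describes can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only: it assumes heat dissipation proportional to the conductor's temperature rise over ambient (a major simplification of the full IEEE 738 thermal rating method), and all numbers are hypothetical.

```python
import math

def derated_ampacity(rated_amps, conductor_limit_c, ambient_rated_c, ambient_c):
    """Approximate line ampacity at a hotter-than-rated ambient temperature.

    Simplification: heat dissipation is taken as proportional to the
    conductor's temperature rise over ambient, so allowable current
    scales with the square root of that rise.
    """
    rise_rated = conductor_limit_c - ambient_rated_c
    rise_actual = conductor_limit_c - ambient_c
    return rated_amps * math.sqrt(rise_actual / rise_rated)

# A line rated 1,000 A at 40C ambient with a 100C conductor limit,
# operating during a 48C heat wave.
print(round(derated_ampacity(1000.0, 100.0, 40.0, 48.0)))  # prints 931
```

Even this crude estimate shows a heat wave trimming roughly 7% off a line's rating, which is the kind of shortfall that local dispatchable capacity would have to cover.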
Schurr and other industry experts who spoke with POWER noted the importance of having real-time, accurate operational data that can provide the necessary insights to make informed decisions about power generation and delivery. They agreed that utilities must invest in data validation and verification technologies to ensure the quality of their data.
Energy Efficiency
Julian McConnell, director of owner’s engineering at Bureau Veritas, a France-based testing, inspections, and certification company that works with the electric power sector, told POWER, “Efficiency in the energy context is all about delivering energy with as little loss as possible in transit. Data helps operators monitor everything that’s happening in real time, which means they can adjust operations to save energy or costs. Power plant operators taking a proactive monitoring approach will likely have a smoother operation overall with less downtime. By using data to fine-tune operations, power plants not only save on costs but also contribute to a greener environment by reducing unnecessary energy use.”
“As the sources of power become far more distributed, digitalization is vital to effectively manage and balance decentralized generation infrastructure,” said Philippe Beauchamp, director of utility solutions at Eaton. “The predictive diagnostic and network modeling capabilities of power engineering and analysis tools are major enablers. These types of solutions help utilities make sense of their collected data to better plan for capacity additions, forecast production, optimize energy storage, support reliable operations, and closely manage the health of energy assets to support more resilient power.”
Renewable energy (Figure 2) also benefits from data integration. The interconnection backlog for projects such as solar and wind farms can be reduced by better scenario planning, and through grid simulations that provide more visibility into the impact of renewables on the transmission and distribution system. Having that data available should support faster deployment, and overall growth, of renewable energy projects.
“At Eaton, we see a vast opportunity for utilities to harness data to accelerate DER [distributed energy resource] integration and pave the way toward a low-carbon future,” said Beauchamp. “According to the U.S. Department of Energy, there was over 1,400 gigawatts of total generation and storage capacity in U.S. interconnection queues at the end of 2022, and projects spend approximately 3.7 years waiting in line. Regulatory bodies like the Federal Energy Regulatory Commission (FERC) and regional operators are looking to speed up interconnection requests.”
Beauchamp continued, “An accurate and holistic digital model of the grid is essential to manage change and efficiently plan for the future, because these models create a foundation for utilities to simulate various scenarios and automate complex engineering processes. For example, every interconnection request requires a system impact study to establish the upgrades needed [and estimated costs] to support a safe, reliable electric grid. Traditionally, this type of analysis can take even highly trained and experienced engineers weeks or months to accomplish. Today, the combination of accurate system modeling and integration capacity analysis software tools is helping improve efficiency and reduce human error by allowing engineers to run simulations and automate system impact analysis in a matter of minutes.”
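The kind of automated screening Beauchamp describes can be illustrated with a minimal linear sensitivity check. The function and all data below are hypothetical; a real system impact study uses full AC power flow models, many contingency cases, and vendor-specific tooling.

```python
def screen_interconnection(base_flows_mw, limits_mw, ptdf, injection_mw):
    """Linear screening of one interconnection request.

    ptdf[line] is the (assumed) sensitivity of that line's flow to the
    proposed injection; any line pushed past its limit is flagged.
    """
    overloads = {}
    for line, base in base_flows_mw.items():
        new_flow = base + ptdf[line] * injection_mw
        if abs(new_flow) > limits_mw[line]:
            overloads[line] = new_flow
    return overloads

# Hypothetical two-line system and a proposed 100-MW plant.
base_flows = {"A-B": 80.0, "B-C": -40.0}
limits = {"A-B": 100.0, "B-C": 60.0}
ptdf = {"A-B": 0.30, "B-C": -0.10}

print(screen_interconnection(base_flows, limits, ptdf, injection_mw=100.0))
```

Here line A-B is flagged at 110 MW against a 100-MW limit, identifying the upgrade a full study would then cost out, while line B-C stays within its rating.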
Software Platforms
Several companies, such as Pittsburgh, Pennsylvania-based Pearl Street Technologies, and Denver, Colorado-based Nira Energy, are developing software platforms designed to help solve those interconnection challenges. Pearl Street has worked with groups such as Southern Co. and the Southwest Power Pool to facilitate processes and studies around interconnection. Nira enables developers to have insight into the methodologies used by utilities and power grid operators when those groups determine grid interconnection points. Meanwhile, California-based PXiSE Energy Solutions, working with energy companies worldwide, provides grid control technology for DER and renewable energy integration.
In August of last year, GE Vernova’s Digital business said it had acquired Greenbird Integration Technology AS, a data integration platform company focused on utilities. GE Vernova said the acquisition highlights the company’s “commitment to investing in technologies and talent that help accelerate the sustainable energy grid.” GE Vernova said it would use Greenbird’s technology and data integration experts to expand the capabilities of GE’s GridOS software portfolio for grid orchestration, “making it faster and easier to connect and integrate energy data across IT [information technology] and OT [operational technology] systems.”
At the time, Scott Reese, CEO of GE Vernova’s Digital division, said, “Utilities have an urgent need to connect data from multiple sources to gain visibility and effectively automate their grid operations. Fragmented data is a major obstacle to modernizing the grid and is holding the energy transition back. The Greenbird acquisition brings the proven ability to connect multiple data sources and accelerates our vision for GridOS, which is making energy security a reality for many of the world’s leading utilities. Data and AI [artificial intelligence] are key to helping utilities run a reliable and resilient grid, and this acquisition is a massive accelerant to making that vision a reality for utilities of all sizes.”
GE Vernova said the GridOS orchestration software platform and application suite supports grid security and reliability, and provides resiliency and flexibility. The group said, “The software portfolio uses a federated data fabric to pull together energy data, network modeling, and AI-driven analytics from across the grid.” The company noted that the platform “connects modern software like Advanced Energy Management System (AEMS), Advanced Distribution Management Solutions (ADMS), and Distributed Energy Resource Management System (DERMS), creating new opportunities for grid automation.”
In a practical example, GE Vernova said the need for more connected and integrated data will come from the electric vehicle (EV) segment (Figure 3). As more EVs hit the road, the vehicles “will both draw from and possibly contribute to the grid as ‘rolling batteries’ that can be tapped when demand is high and supply is low. Integration of data from multiple sources like charging stations as well as operations, forecasting, billing, and other systems can support the success of future use cases such as vehicle-to-grid (V2G) while keeping the grid safe.”
Optimizing the Power Grid
Energy analysts agree that the U.S. power grid needs upgrades. Finding cost-effective solutions to improve infrastructure is a continuing challenge.
Otto Lynch, vice president and head of Power Line Systems at Bentley Systems, said data can be utilized to support decarbonization of power generation, and perhaps reduce the need to retire some thermal facilities that could still support the electricity supply.
“A digital twin of the electric grid could certainly lead to the complete decarbonization of electric power generation. However, this is not wise,” Lynch said. “As Texas, Germany, and several others have discovered, converting to completely [or mostly] renewable electric generation is a recipe for potential disaster; the wind doesn’t always blow, the sun doesn’t always shine, ice storms can freeze up windmills and cover solar panels. As any wise investor knows, diversification is key.”
Lynch said thermal power plants should not be retired, but rather kept available for when demand for power exceeds supply. “Do not decommission a coal or a natural gas power generation facility,” Lynch said. “Keep it in reserve for when it is needed. While you can’t simply flip a switch and turn on a coal plant, proper planning could forecast when it may be needed and it could be ready in such events. Of course, having a digital twin of the grid, with a real-time understanding of demand and available generation, will enable an effective balance of generation sources, which would lead to an overall reduction in carbon-based electric power generation.”
Lynch told POWER utilizing data can help power generators and grid operators overcome some of the limitations of infrastructure put in place decades ago. That includes, as noted earlier, the use of a “digital twin of the grid to direct power plant operators to efficiently generate electricity as customer demand fluctuates.” Lynch noted that “the U.S. electric grid is a patchwork of transmission and distribution lines built by many different entities” over the past century.
“All of it was built to different standards,” said Lynch. “Much of our grid was built more than 50 years ago, and at that time, they did not plan on moving as much power as we are today.” Lynch said, “Transmission lines are usually thermally limited. That is, they are designed for a simultaneous combination of ambient temperature, wind, sun angle [solar radiation], and electrical loading [amps]. The combinations of these factors determine the operating temperatures of the conductors. A high ambient temperature with a high electrical loading and a high solar radiation with very little wind will result in a very high operating temperature of the conductors.”
Lynch continued: “The higher the operating temperatures of the conductors, the more the conductors will sag [putting them closer to the ground, vegetation, and other obstacles]. Lines that were designed in the 1970s and before were mostly only designed for a 120F operating temperature. It is normal today to run transmission lines at 212F, 250F, and in some cases, even hotter. Prior to the 1970s, the National Electrical Safety Code [NESC] prescribed certain clearances to be met at the 120F design temperature. In the 1970s, the NESC reduced these minimum clearances but they were specified to be met at the actual maximum operating temperature of the conductors. This allowed more power to be transmitted over the existing transmission lines.”
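Lynch's point about temperature and sag can be sketched with the parabolic conductor-length approximation. This toy calculation accounts for thermal elongation only (real sag-tension analysis also models elastic stretch and tension changes), and the span, sag, and expansion-coefficient figures are hypothetical, if typical for ACSR conductor.

```python
import math

def sag_at_temperature(span_m, sag_ref_m, temp_ref_c, temp_op_c,
                       alpha_per_c=19.3e-6):
    """Estimate conductor sag at an operating temperature.

    Uses the parabolic approximation for conductor arc length,
    L = S + 8*D**2 / (3*S), plus pure thermal elongation.
    """
    # Arc length at the reference (design) temperature.
    length_ref = span_m + 8.0 * sag_ref_m**2 / (3.0 * span_m)
    # Thermal expansion between the two temperatures.
    length_op = length_ref * (1.0 + alpha_per_c * (temp_op_c - temp_ref_c))
    # Invert the parabolic length formula to recover the new sag.
    return math.sqrt(3.0 * span_m * (length_op - span_m) / 8.0)

def f_to_c(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0

# A 300-m span designed for 6 m of sag at 120F, operated at 250F.
sag_design = 6.0
sag_hot = sag_at_temperature(300.0, sag_design, f_to_c(120), f_to_c(250))
print(f"Sag grows from {sag_design:.1f} m to {sag_hot:.2f} m")
```

Even with a tiny fractional change in conductor length, the sag grows by roughly half in this example, which is why the operating temperature, not the ambient air, sets the clearance that governs a line's rating.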
Lynch told POWER, “Our grid [in the U.S. and worldwide] is currently operating at full capacities for the given constraints set forward by codes and laws. Rerouting this new source of renewable power back to where the demand [was] for the decommissioned power plants is not easy. Just like moving the fuse panel in your house from one end to the other, this requires a great amount of work.
“We are having to do what I have said many times over the past few decades, we have to basically ‘rewire America,’ ” he said. “In many cases, it will require new transmission lines. Many are proposed, but of course, they cannot be permitted fast enough. We will need to rerate many existing transmission lines. A digital twin of the entire grid will allow these decisions to be made quickly, effectively, and efficiently.”
Stability and Reliability
Managing power supply and demand is more challenging as more diverse generation resources are added to the grid (Figure 4). That challenge becomes greater as power producers and grid operators explore ways to improve the reliability and resiliency of generation assets, along with power lines and substations.
“Data is essential for maintaining a carefully calculated balance between supply and demand to ensure a stable and reliable grid supply. It’s the enabler for better management of the grid, especially when adapting to variable energy sources,” said McConnell. “Data fuels the analysis required to determine how much generation is predicted to be needed to meet a forecasted load. Regardless of location, data-driven insight is critical for preventing power outages and enhancing the overall efficiency and quality of the transmission and distribution aspects of the grid.”
Beauchamp told POWER, “Data is foundational to generate a complete, accurate grid model that enables effective long-term planning strategies for balancing energy demand and supply. These models can provide accurate insight into non-wires alternatives to reduce grid congestion, leverage DERs, manage peak demand to offset capital investments, and sustainably meet new energy demands.”
Beauchamp continued: “These critical capabilities hinge on the ability to gather intelligence at the edge of utility networks for localized processing, automation, and control—before unifying collected data using software that can support the modeling of new devices, power sources, microgrids, power electronics, and storage.”
“Data is key for predicting the availability of variable renewable energy sources such as solar and wind, which in turn allows for better planning of energy production,” said McConnell. “Data is critical when assessing weather patterns, for example; renewable energy operators want to understand if there’s going to be a significant amount of wind or what cloud coverage looks like in regard to forecasted solar production. Data also helps guide the optimal use of batteries and other energy storage systems to ensure that renewable energy is used more effectively.”
McConnell noted the use of machine learning (ML) and AI also is expected to grow in the electric power sector. “Machine learning and artificial intelligence systems can also be leveraged to increase the precision and effectiveness of forecasted renewable energy resource availability outlooks, and these complex systems require data to generate predictive output results,” he said.
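As a toy illustration of the forecasting McConnell describes, the sketch below fits a simple least-squares line relating solar output to cloud cover; production systems use far richer models and many more weather features. All training data here is invented.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope*x + intercept."""
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    sxx = sum((x - x_mean) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, y_mean - slope * x_mean

# Invented training data: cloud-cover fraction vs. observed solar output (MW).
cloud_cover = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
solar_mw = [95.0, 80.0, 62.0, 45.0, 28.0, 12.0]

slope, intercept = fit_line(cloud_cover, solar_mw)

def forecast_solar(cloud_forecast):
    """Predict output (MW) for a forecast cloud-cover fraction, floored at zero."""
    return max(0.0, slope * cloud_forecast + intercept)

print(f"Forecast at 30% cloud cover: {forecast_solar(0.3):.1f} MW")
```

The same shape, historical weather paired with observed output, feeding a fitted model that turns a weather forecast into a production forecast, carries over to the ML and AI systems McConnell mentions, just with much larger feature sets and nonlinear models.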
McConnell told POWER, “A decentralized approach allows distribution network participants to both use energy and produce it themselves to feed it back into the grid, or to, in some cases, even operate their own small microgrids. This approach contributes to an overall more flexible bulk electric system and also helps reduce reliance on large, centralized power plants. For example, when considering power plant outage scenarios, it can be advantageous to have distributed energy resources that can quickly and efficiently ramp up if a large power plant goes offline, supporting the load that would normally be served by the large power plant.”
Regulatory Environment
McConnell said data’s value also is enhanced as energy systems come under more federal, state, and local regulation. “Data is essential for regulatory compliance purposes,” he said. “Examples of this could be records an owner or operator is required to keep in order to prove they meet the NERC-CIP [North American Electric Reliability Corp.-Critical Infrastructure Protection] standards that are applicable to their facility type and impact level, or ongoing data recording required to meet specific jurisdictional ISO [independent system operator] requirements. Compliance data requirements vary depending on a variety of different factors and it’s important for power generation asset owners and operators to remain up to date with the latest requirements, making sure they are keeping the appropriate data to support regulatory compliance efforts in their organization.”
Beauchamp, whose company supports the power generation industry worldwide, said maintenance of the power grid will rely even more on data collection due to the impacts of extreme weather. “In North America, because of the wide diversity of its geography with large, less densely populated territories, most of the distribution grid is overhead and exposed to extreme weather events,” said Beauchamp. “As these extreme weather events become more frequent, it is especially important to fortify the electric grid. In fact, the White House recently announced the largest-ever investment in America’s electric grid to help U.S. utilities accelerate these types of weather-hardening projects.”
Beauchamp said that advanced self-healing smart grid technologies utilize real-time data to help utilities reduce the duration and impact of sustained power outages. “These types of data-driven enhancements provide utilities the world over with the valuable insights, reviewable event data, and the automation needed to enable more efficient operations, prevent costly failures, and reduce downtime. For example, feeder automation software precisely monitors fault location data to perform isolation and service restoration,” said Beauchamp. “This technology helps utilities automatically reconfigure or ‘heal’ the grid by detecting, isolating, and rerouting power—turning sustained outages into momentary ones.”
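The detect-isolate-reroute sequence Beauchamp describes can be sketched on a radial feeder. This is a minimal illustration, not any vendor's feeder automation logic; the section names, loads, and tie-switch capacity are all hypothetical.

```python
def flisr(sections, faulted, tie_capacity_mw):
    """Sketch of fault isolation and service restoration on a radial feeder.

    sections: ordered (name, load_mw) pairs from the substation outward.
    Opens the switches around the faulted section, keeps upstream
    sections on the substation, and re-energizes downstream sections
    from a tie switch if it has enough spare capacity.
    """
    names = [name for name, _ in sections]
    idx = names.index(faulted)
    upstream = names[:idx]            # still fed by the substation
    isolated = [faulted]              # switched out for repair
    downstream = sections[idx + 1:]
    downstream_load = sum(mw for _, mw in downstream)
    if downstream_load <= tie_capacity_mw:
        restored = [name for name, _ in downstream]
    else:
        restored = []                 # a real scheme would shed or partially restore
    return upstream, isolated, restored

# Hypothetical three-section feeder with a fault in the middle section.
feeder = [("S1", 2.0), ("S2", 1.5), ("S3", 3.0)]
print(flisr(feeder, "S2", tie_capacity_mw=5.0))  # (['S1'], ['S2'], ['S3'])
```

Only the faulted section stays dark: its neighbors ride through on the substation or the tie, which is how a sustained outage becomes a momentary one for most customers.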
Mathias Burghardt, member of the executive committee and head of Infrastructure for Ardian, a France-based global investment group that works in the energy industry, said in a recent report, “Harnessing the data potential of our energy systems is imperative if we are to achieve the goals of the Paris Agreement and progress towards ‘Net Zero.’ Currently, only a minority of our electricity is supplied by renewables, but demand in the European Union is set to increase by 50% by 2050. Geopolitical crises such as the war in Ukraine have heightened the urgency of the energy transition, pushing investors and governments to seek solutions to provide citizens and businesses with clean, affordable, and secure energy.”
Burghardt explained, “Only by adopting a digital strategy and harnessing the potential of data can energy systems overcome the challenges between production, transmission, distribution, and consumption,” and said “digitization should be at the heart” of any discussions about investments in clean energy. “To enable such investments, regulatory changes will also need to be encouraged to promote innovation,” Burghardt said.
Schurr agreed, and said data collection is key for power generators as they navigate the energy transition. “The growing challenge for electric utilities is to manage three objectives concurrently: reliable power, affordable prices, and decarbonized energy supply,” he said. “Decentralized power generation must be better integrated into distribution planning and operations so that the advantages of dispatchable local supply, including bringing more renewables online and providing power closer to the point of use, can be exploited by utilities for the benefit of all customers. To achieve this integration, real-time communications are needed, and data about these distributed assets’ current state and potential operation can feed supply and delivery planning models and reduce the cost to meet reliability standards.”
—Darrell Proctor is a senior associate editor for POWER (@POWERmagazine).