Modernizing power-grid management for a new era of energy sources

PI: Christopher S. Oehmen,
Pacific Northwest National Laboratory

In 2016, the Department of Energy’s Exascale Computing Project (ECP) set out to develop advanced software for the arrival of exascale-class supercomputers capable of a quintillion (10¹⁸) or more calculations per second. That leap meant rethinking, reinventing, and optimizing dozens of scientific applications and software tools to leverage exascale’s thousandfold increase in computing power. That time has arrived now that the first DOE exascale computer — the Oak Ridge Leadership Computing Facility’s Frontier — has opened to users around the world. “Exascale’s New Frontier” explores the applications and software technology for driving scientific discoveries in the exascale era.

The Science Challenge

As more renewable sources of energy are added to the national power grid, the grid becomes more complex to manage. That’s because renewables such as wind and solar rely on forces of nature that behave unpredictably. Combined with the potentially adverse effects of extreme weather events or cyberattacks, these factors introduce new uncertainties into the power grid that may cause instability. Current programs that help operators manage power grids are not inherently built to consider weather uncertainty and cannot evaluate sudden changes in power supplied from renewable sources, leaving power grids at risk.

Why Exascale?

To tackle this widening gap in power-grid management, the ExaSGD project has developed a software stack that uses the power of GPU-accelerated computers to assess and predict a far wider range of possibilities than current management programs can.

The software stack consists of ExaGO, a power-system modeling framework that uses high-fidelity grid models to analyze optimal AC power flows on new and emerging architectures; HiOp, a high-performance optimization solver that improves a power grid’s response to disruptive weather events; and the Wind Integration National Dataset Toolkit, which includes instantaneous meteorological conditions from computer model output and calculated turbine power for more than 126,000 sites in the continental United States.
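To make the division of labor among those components concrete, the following is a minimal, self-contained Python sketch of that workflow. Every function, site name, and number in it is a hypothetical stand-in, not the actual ExaGO, HiOp, or WIND Toolkit interface: it samples weather-driven generation scenarios, pairs them with equipment-failure contingencies, and runs a toy stand-in for an optimal power flow solve on each pair.

```python
# Illustrative sketch only: every function below is a hypothetical stand-in,
# not the actual ExaGO, HiOp, or WIND Toolkit API. It mirrors the workflow
# described above: weather-driven generation scenarios are paired with
# equipment-failure contingencies, and each pair gets an optimization solve.
import itertools
import random

WIND_SITES = ("site_A", "site_B", "site_C")


def sample_wind_scenario(rng):
    """Stand-in for drawing a weather scenario from WIND Toolkit-style data:
    a random available-capacity fraction for each wind site."""
    return {site: rng.uniform(0.0, 1.0) for site in WIND_SITES}


def solve_toy_opf(demand_mw, scenario, contingency):
    """Stand-in for an ExaGO/HiOp optimal power flow solve. Returns a toy
    operating cost that grows when wind output is low or equipment is lost."""
    wind_mw = 500.0 * sum(scenario.values())               # available wind power
    shortfall_mw = max(0.0, demand_mw - wind_mw)            # met by conventional units
    contingency_penalty = 250.0 if contingency else 0.0     # losing equipment costs more
    return 40.0 * shortfall_mw + contingency_penalty


if __name__ == "__main__":
    rng = random.Random(42)
    demand_mw = 1_200.0

    scenarios = [sample_wind_scenario(rng) for _ in range(10)]
    contingencies = [None, "line_1_out", "gen_3_out", "line_7_out"]

    # Evaluate every scenario-contingency pair; ExaSGD does the same thing
    # with full AC power-flow models at a vastly larger scale on GPUs.
    costs = {
        (idx, outage): solve_toy_opf(demand_mw, scen, outage)
        for (idx, scen), outage in itertools.product(enumerate(scenarios), contingencies)
    }

    worst_pair, worst_cost = max(costs.items(), key=lambda item: item[1])
    print(f"evaluated {len(costs)} pairs; worst case {worst_pair} costs {worst_cost:,.0f}")
```

In the real stack, each of those solves is a full AC optimal power flow handled by ExaGO and HiOp on GPU hardware; the toy cost function here only preserves the shape of the loop.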

Leveraging the power of exascale-class supercomputers, ExaSGD produces much larger, more complex models of power-grid systems and their potential disruptors than current software running on commodity computing hardware can. This allows ExaSGD to formulate more precise solutions for keeping electricity flowing, and to arrive at them sooner. And now that its codes have been optimized for GPU-accelerated computers, the ExaSGD software stack can also apply these advantages at much smaller scales, to both smaller computing systems and smaller power grids.

“We harnessed the power of the world’s fastest computing components to solve the core problem so that we could take the most advantage of the kind of computers that we expect to see in the future,” said Christopher S. Oehmen, ExaSGD project leader and a senior research scientist at Pacific Northwest National Laboratory. “When people want to solve smaller versions of this problem for themselves, not on an exascale system, they still get the benefit of having a mini-supercomputer because the algorithm will actually run on their GPU hardware accelerator.”

Frontier Success

For ExaSGD’s challenge-problem runs on Frontier, the team created a high-fidelity model of a power grid, including electrical generation, transmission, load, and cyber/control elements. The model was then subjected to randomized contingency scenarios, which the ExaSGD software stack evaluated to compute solutions for reaching the grid’s optimal operating point. The project also needed to outperform the North American Electric Reliability Corporation’s operating standard of a 30-minute short-term response.
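In optimization terms, finding a grid’s optimal operating point under contingencies is typically posed as a security-constrained AC optimal power flow. The following is a textbook-style sketch of that formulation, not necessarily ExaSGD’s exact one: c_g is the cost of generator g; p, v, and θ are power setpoints, voltage magnitudes, and angles; k = 0 is the base case and k = 1, …, K are the contingencies.

```latex
\[
\begin{aligned}
\min_{p,\,v,\,\theta} \quad & \sum_{g \in \mathcal{G}} c_g\bigl(p^{(0)}_g\bigr) \\
\text{subject to} \quad
& F\bigl(p^{(k)}, v^{(k)}, \theta^{(k)}\bigr) = 0
  && \text{AC power-flow balance in case } k \\
& \underline{p}_g \le p^{(k)}_g \le \overline{p}_g, \qquad
  \underline{v}_i \le v^{(k)}_i \le \overline{v}_i
  && \text{generator and voltage limits} \\
& \bigl|p^{(k)}_g - p^{(0)}_g\bigr| \le \Delta_g
  && \text{limited corrective redispatch} \\
& k = 0, 1, \dots, K
  && \text{base case and } K \text{ contingencies}
\end{aligned}
\]
```

ExaSGD’s exascale contribution is solving many instances of this kind of problem, one per scenario-contingency pair, simultaneously across GPU-accelerated nodes.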

In the largest simulation of its kind to date, the 10,000-bus grid model faced 10,000 failure contingencies and 10 random scenarios for a total of 10⁵ scenario-contingency pairs, running on 9,000 Frontier nodes. The challenge problem required a minimum of 10,000 optimization parameters — ExaSGD exceeded this threshold to solve for 26,786. And the team delivered a short-term response of 16 minutes, beating the 30-minute requirement.
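As a quick check on the scale of that run (the even split of work across nodes below is a back-of-the-envelope assumption for illustration, not a figure reported by the team), the arithmetic works out as follows:

```python
# Back-of-the-envelope check of the challenge-problem scale.
# The even split of work across nodes is an illustrative assumption,
# not ExaSGD's actual scheduling strategy.
contingencies = 10_000   # equipment-failure contingencies
scenarios = 10           # random scenarios
nodes = 9_000            # Frontier nodes used in the run

pairs = contingencies * scenarios
print(f"scenario-contingency pairs: {pairs:,}")              # 100,000 = 10^5
print(f"pairs per node, even split: {pairs / nodes:.1f}")    # about 11.1
```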

“We built something from the ground up, and we could have only done that with ECP support. This is not an application that was in the leadership-class computing’s radar before now. This is the first time we’ve ever been part of this conversation, and so we went from not having any prior HPC applications, at least not in this area, to having the world’s first. So that is a brand-new capability that did not exist before the ECP’s investment, and we literally would not be here without it,” Oehmen said.

What’s Next?

The ExaSGD project’s next steps will be to find power-industry partners to test the software stack and compare its capabilities with those of their current grid-management programs. Oehmen foresees ExaSGD becoming a valuable decision-support tool for power-grid operators and planners over the next 10 years, at whatever level of computing power they have available.

“It would be a huge success for us if we had people running it on laptops with GPUs — even at that level, they’re going to be able to see a lot more than they can see today,” Oehmen said. “With larger systems and more time, we would be able to look at all sorts of things. What happens when a hurricane lands? What happens if global climate change leads to even more extreme weather? There are all sorts of questions that you can’t answer today without this kind of big computing, and those are the things that we want to explore more and more going forward.”

Support for this research came from the ECP, a collaborative effort of the DOE Office of Science and the National Nuclear Security Administration, and from the DOE Office of Science’s Advanced Scientific Computing Research program. The OLCF is a DOE Office of Science user facility.

UT-Battelle manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit https://energy.gov/science.

Coury Turczyn

Coury Turczyn writes communications content for the Oak Ridge Leadership Computing Facility (OLCF). He has worked in different communications fields over the years, though much of his career has been devoted to local journalism. In between his journalism stints, Coury worked as a web content editor for CNET.com, the G4 cable TV network, and HGTV.com.