PI: Jordan M. Musser, National Energy Technology Laboratory
In 2016, the Department of Energy’s Exascale Computing Project set out to develop advanced software for the arrival of exascale-class supercomputers capable of a quintillion (10¹⁸) or more calculations per second. That leap meant rethinking, reinventing, and optimizing dozens of scientific applications and software tools to leverage exascale’s thousandfold increase in computing power. That time has now arrived: the first DOE exascale computer, the Oak Ridge Leadership Computing Facility’s Frontier, has opened to users around the world. “Exascale’s New Frontier” explores the applications and software technology driving scientific discoveries in the exascale era.
The Science Challenge
Although the U.S. is adding renewable energy sources to its power grid at an increasing pace, fossil fuels still generate much of the country’s electricity. About 60% of the grid’s power currently comes from fossil fuels, and although that share is declining, it will still be as high as 44% by 2050, according to projections from the U.S. Energy Information Administration. This means that carbon capture and storage technologies must advance quickly to help reduce carbon dioxide emissions from fossil-fuel power plants. One of the biggest hurdles is scaling up laboratory designs of reactors with carbon capture and storage technology to industrial proportions. Simply enlarging a design to “life size” is not a reliable way to anticipate potential flaws, and building progressively larger units for testing is cost prohibitive. Solving this problem requires a suite of high-fidelity computational tools that can simulate the physics at work in emerging carbon capture and storage technologies, predict the viability of different designs at scale, and help prepare them for commercial use.
Why Exascale?
MFIX-Exa is a scalable computational fluid dynamics-discrete element method (CFD-DEM) code: it resolves a continuum fluid phase on a grid while tracking every individual solid particle moving through it. This type of flow is common in industrial reactors, especially in carbon capture applications, and will be an integral part of proposed chemical looping reactors, or CLRs. With physics models imported from the National Energy Technology Laboratory’s workhorse MFIX code and optimized for GPU-accelerated exascale supercomputers, MFIX-Exa takes CFD-DEM modeling of these reactors from laboratory bench scale to commercial pilot scale and beyond.
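MFIX-Exa itself is a C++ code built on DOE’s AMReX framework, but the essential CFD-DEM idea can be sketched in a few lines. The Python sketch below is illustrative only, assuming a 1-D periodic domain and a simple linear drag law; every name and parameter value is hypothetical, and none of it comes from MFIX-Exa. It shows the two-way coupling at the heart of any CFD-DEM solver: interpolate the fluid velocity to each particle, apply a drag force, and deposit the equal-and-opposite force back onto the fluid grid.

# Illustrative two-way CFD-DEM coupling sketch; not MFIX-Exa code.
# Assumptions: 1-D periodic domain, uniform grid, linear drag law,
# explicit Euler time stepping. All names and values are hypothetical.
import random

NX, LX = 32, 1.0                 # number of fluid cells, domain length
DX = LX / NX                     # cell size
DT = 1e-3                        # time step
BETA = 0.5                       # linear drag coefficient
M_P = 1e-2                       # particle mass
M_F = 0.1                        # lumped fluid mass per cell

u = [1.0] * NX                   # fluid velocity in each cell
particles = [{"x": random.random() * LX, "v": 0.0} for _ in range(100)]

for _ in range(1000):
    for p in particles:
        i = int(p["x"] / DX) % NX             # cell containing this particle
        drag = BETA * (u[i] - p["v"])         # fluid-on-particle drag force
        p["v"] += DT * drag / M_P             # accelerate the particle
        p["x"] = (p["x"] + DT * p["v"]) % LX  # move it, with periodic wrap
        u[i] -= DT * drag / M_F               # equal-and-opposite force on fluid

mean_v = sum(p["v"] for p in particles) / len(particles)
print(f"mean particle speed after coupling: {mean_v:.3f}")

A production CFD-DEM code also advances the fluid with a full Navier-Stokes solve and resolves particle-particle collisions; tracking billions of such particles every time step is what makes exascale hardware necessary.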
“We simulated a small-scale CLR, which is only a 50-kilowatt thermal unit. And that system alone has 5 billion particles in it. So, if we were to look at commercializing these systems to contain possibly trillions or hundreds of trillions of particles, then the ability to model that requires an enormous amount of compute capability. And with MFIX-Exa, we can make full use of the GPU systems at all the DOE leadership computing facilities,” said Jordan M. Musser, a NETL physical research scientist and MFIX-Exa project leader.
Frontier Success
In CLRs, multiple reactors are used to separate combustion into two distinct processes: oxidation and reduction. In one reactor, particles take up oxygen from air, thereby producing heat. The oxidized particles are then transported to a second reactor in which they are reduced by a fuel. Because contaminants such as nitrogen are kept out of the second reactor, the carbon dioxide produced can be collected for use or storage without employing costly separation processes. Finally, particles are passed back to the first reactor to be oxidized again, completing the loop. The MFIX-Exa team’s ECP challenge was to model a full-loop CLR that contained billions of particles and to include interphase momentum, mass, and energy transfer between the fluid and particles in the model.
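The article does not specify the oxygen carrier or the fuel, but the loop is easy to see with a generic metal/metal-oxide pair (Me/MeO) and methane as the fuel; the following schematic reactions are illustrative only:

% Air reactor: the carrier is oxidized, releasing heat.
2\,\mathrm{Me} + \mathrm{O_2} \rightarrow 2\,\mathrm{MeO}

% Fuel reactor: the carrier is reduced by the fuel.
4\,\mathrm{MeO} + \mathrm{CH_4} \rightarrow 4\,\mathrm{Me} + \mathrm{CO_2} + 2\,\mathrm{H_2O}

Because the fuel never contacts air, the fuel-reactor exhaust is essentially just carbon dioxide and steam, and condensing out the water leaves a capture-ready CO2 stream.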
The team successfully modeled NETL’s 50-kilowatt thermal CLR at a spatial resolution of 400 microns within the reactor, using almost a billion computational cells to represent the fluid and 5.2 billion computational particles to represent every particle in the system. The simulation scaled across 82% of Frontier’s GPUs.
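To put those numbers in rough perspective, here is a back-of-the-envelope check; this is my own arithmetic, not a figure from the team, and it assumes OLCF’s published Frontier configuration of 9,408 nodes with four AMD MI250X GPUs each.

# Rough scale check; node/GPU counts are public Frontier specs, the rest
# follows from the figures quoted above.
gpus_total = 9408 * 4              # 9,408 nodes x 4 MI250X GPUs = 37,632
gpus_used = 0.82 * gpus_total      # "82% of Frontier's GPUs"
particles = 5.2e9
print(f"{gpus_used:.0f} GPUs, ~{particles / gpus_used:,.0f} particles per GPU")
# -> roughly 30,858 GPUs and about 170,000 particles per GPU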
What’s Next?
Musser and his team are working with program managers at the DOE’s Office of Fossil Energy and Carbon Management to continue developing MFIX-Exa and to add capabilities that address point-source carbon capture and direct-air capture of carbon dioxide. They are also partnering with Lawrence Berkeley National Laboratory to conduct research within the DOE’s High Performance Computing for Energy Innovation program, which grants manufacturers access to high-performance computing facilities and their researchers for advanced R&D.
However, it all started with the ECP.
“There is no doubt in my mind that we would not have been able to achieve this accomplishment without ECP support — not only the financial support but also the ecosystem where they brought together different labs and different research groups to tackle these problems as co-units with one another,” Musser said.
Support for this research came from the Exascale Computing Project, a collaborative effort of the DOE Office of Science and the National Nuclear Security Administration, and the DOE Office of Science’s Advanced Scientific Computing Research program. The OLCF is a DOE Office of Science user facility.
UT-Battelle LLC manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.