Bronson Messer marshals computational scientists around the world to launch their exascale-ready codes on Frontier
The “Pioneering Frontier” series features stories profiling the many talented Oak Ridge National Laboratory employees behind the construction and operation of the Oak Ridge Leadership Computing Facility’s exascale supercomputer, Frontier. The HPE Cray system was delivered in 2021 and is now being prepared for full user operations.
Answer: This 2003 Jeopardy! champion honestly does not remember his final “Final Jeopardy!”
Question: Who is Bronson Messer?
“I think it was something about US presidents. I think,” said Messer, who serves as director of the Frontier Center for Accelerated Application Readiness (CAAR) as well as the director of science for the Oak Ridge Leadership Computing Facility (OLCF). “I do remember several of the Final Jeopardy! subjects because one of them was about a Broadway musical, Flower Drum Song, and I had no idea. I remember that distinctly.”
Despite this concerning lack of firsthand Jeopardy! trivia knowledge, Messer is a highly accomplished scientist and administrator at the OLCF, a US Department of Energy (DOE) Office of Science User Facility located at Oak Ridge National Laboratory (ORNL). He’s a Distinguished Scientist at ORNL, part of a team that won a 2022 R&D 100 Award, and a 2022 HPCwire Person to Watch.
Messer is also the person most responsible for making sure that the OLCF’s next-generation exascale supercomputer, Frontier, will have groundbreaking scientific inquiries to conduct once it goes into production.
For the past four years, as the head of CAAR, Messer has been working with teams of computational scientists from around the world to prepare their project codes for the new supercomputer, which is capable of 2 exaflops (2 billion-billion floating point operations per second) of theoretical peak performance. That’s up to 10 times faster than today’s top supercomputers. Frontier also uses a new architecture: an HPE Cray EX system with AMD EPYC™ CPUs and AMD Instinct™ GPUs. Adapting project codes to work on a future system whose specs were still being finalized did not distress him, and it still doesn’t.
“Things not working on day 1? Not only am I not scared about it, I expect it. But, by day 5, I’m confident that things will work because we have a good understanding of what’s going on in the codes and how to fix things,” Messer said. “I’m more concerned that everybody gets the right answers and that they see a marked reduction in the total time to scientific insight. I’m ultimately most excited for them to just get through those inevitable problems, because that’s just what you do.”
Such confidence has been earned. This is Messer’s second stint as CAAR lead, part of a long history of taking on managerial duties throughout his science career. After earning his PhD in physics from the University of Tennessee, Knoxville, in 2000, Messer became a postdoctoral research associate at ORNL. In 2003, he joined the University of Chicago as a research associate at the Flash Center for Computational Science. Finally, in 2005, he returned to ORNL as an R&D associate and has held numerous titles since then, including acting director of science for the National Center for Computational Sciences at ORNL, acting group leader for the lab’s Scientific Computing Group, and section head for Science Engagement.
As a scientist, Messer has primarily focused on the explosion mechanisms and phenomenology of supernovae (both thermonuclear and core-collapse), especially neutrino transport and signatures, dense matter physics, and the details of turbulent nuclear combustion. That might not appear to be the sort of scientific background that leads to management. But being a computational astrophysicist means working with huge simulation codes—and that necessitates teamwork.
“It literally takes a village to build a multi-physics or stellar astrophysics code. I think I knew from my earliest days as a grad student that one of the most important ways to get stuff done was to actually be able to marshal people. I’d say it’s a natural extension of the way I’ve done science—and especially team science—throughout my career,” Messer said.
Guiding disparate teams in far-off locations toward computational readiness isn’t as easy as it sounds. Researchers typically care about one thing: obtaining the answer to their question as quickly as possible. The CAAR program aims to determine how fast that answer can be produced on Frontier by measuring the increase in a predetermined figure of merit for each of the CAAR codes. How does Messer keep the CAAR teams focused on these metrics?
“You have to set the conditions where everybody is enthusiastic about doing what they’re doing. Beyond that, I try to be a facilitator of solutions to problems that may be impeding their progress on those milestones and ultimately to the final answer,” Messer said. “As hackneyed as it sounds, giving people a sense of what the final mission is and then getting out of their way is my overarching philosophy.”
Given the number of OLCF supercomputer launches he’s been part of—beginning with Jaguar in 2005 when he ran the FLASH code on 5,000 cores—Messer isn’t as giddy about Frontier’s thousand-fold leap in computing power on day 1 as he is over Frontier’s long-term impact on science.
“As exciting as it has been to get to exascale, it’s even more exciting that we’re going to have an incredibly capable and durable platform for doing computational science for several years to come,” Messer said. “That’s how science actually works—it’s incremental. As much as we talk about huge phase changes when we get something like exascale capability, we really push forward the frontiers of knowledge in most fields through incremental hard work and building upon earlier understanding.”
UT-Battelle LLC manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.