
During its final hours of operation, the Titan supercomputer simulated the birth of supernovae

Titan, the groundbreaking Cray XK7 supercomputer operated by the Oak Ridge Leadership Computing Facility (OLCF) at the US Department of Energy’s (DOE’s) Oak Ridge National Laboratory (ORNL), was officially decommissioned on August 1.

The petascale machine ran countless simulations over its 7 years of service, and its sheer computational power was consistently in demand by researchers. But for a brief window, just prior to Titan’s decommissioning, only one simulation was running.

This simulation—the last to ever occupy the supercomputer—examined the final moments of a star’s life.

When stars like our Sun run out of fuel, they become red giants and later white dwarfs. In larger stars, those at least 10 times more massive than the Sun, the core collapses into an extremely dense neutron star before launching a massive shock wave in the form of a supernova explosion.

University of Tennessee astrophysicist Eric Lentz, the last user of Titan, answered a few questions about the supercomputer’s impact on his project, as well as what the future holds for his research.

What were you simulating on Titan?

University of Tennessee astrophysicist Eric Lentz. Image credit: Rachel McDowell

Lentz: We’ve been modeling core-collapse supernovae—explosions of massive stars. The state of an exploding star is constantly changing: it starts as a relatively low-density, white-dwarf-like iron core, and in the first second of collapse its density increases by about four orders of magnitude. These simulations are rather computationally expensive, as most things on Titan tend to be, so we don’t get many opportunities for repeated runs; one or two full simulations per allocation is generally the limit of what we’ve been able to do.
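For a rough sense of that density jump, here is a back-of-envelope sketch using textbook densities (assumed values, not numbers taken from Lentz’s simulations), comparing a pre-collapse iron core with nuclear saturation density:

```python
import math

# Back-of-envelope comparison using textbook densities (assumed values,
# not numbers taken from Lentz's simulations).
rho_iron_core = 1e10   # g/cm^3, rough central density of a massive star's iron core
rho_nuclear = 2.7e14   # g/cm^3, approximate nuclear saturation density

orders_of_magnitude = math.log10(rho_nuclear / rho_iron_core)
print(f"density jump during collapse: ~{orders_of_magnitude:.1f} orders of magnitude")  # ~4.4
```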

The simulations are part of a project to explore the variations and inputs that go into core collapse, meaning the factors that tend to influence the nature of stars, such as their composition and mass. A lower-mass star like our Sun has a very different fate than a star that starts with, say, 10 or 20 times its mass. The types of explosions we’re looking at, which have an important impact on developing galaxies by injecting newly made elements into them, themselves span a fairly wide variety of initial conditions. Among the simulations we were running during the final month or so on Titan were models representing stars with about 25, 15, and 10 times the mass of the Sun.

What has Titan meant for your research?

Lentz: We’ve been on Titan sort of from the beginning of general access. I think the biggest change over time has come from the sheer number of nodes and processors available. Titan has allowed us to really, properly capture the resolution needed for 3D runs, particularly with codes like ours, in which the physics evolves during the simulation.

The overall growth of top-end supercomputers like Titan makes the complex 3D simulations that we’ve been doing possible. In particular, on machines like Titan we’ve taken advantage of the node count, and that’s been really critical. To be able to run on 20,000, 30,000, or 40,000 cores for 1,000 hours per run is an extremely rare opportunity. It’s just not something that’s readily available to most researchers; it’s not a level of computational power you can get on a departmental or institutional cluster, so without machines like Titan some of this work would literally not be possible at all. We’ve been very pleased with the things we’ve been able to do on Titan over the years, and it has whetted our appetite to do more.
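To put those numbers in perspective, here is a rough tally of the core-hours involved; the departmental cluster size below is an assumed figure for comparison, not from the interview:

```python
# Rough scale of a single run Lentz describes, compared with a hypothetical
# 1,000-core departmental cluster (the cluster size is an assumed figure).
cores_per_run = 30_000   # midpoint of the 20,000-40,000 cores quoted above
hours_per_run = 1_000    # wall-clock hours per simulation

core_hours = cores_per_run * hours_per_run
print(f"one run: ~{core_hours:,} core-hours")   # ~30,000,000 core-hours

cluster_cores = 1_000    # assumed departmental cluster, running nothing else
years_dedicated = core_hours / cluster_cores / (24 * 365)
print(f"dedicated time on a 1,000-core cluster: ~{years_dedicated:.1f} years")  # ~3.4 years
```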

Can you tell me about the results of the last simulation?

Lentz: While we’re still analyzing the data from that run, we can already see how clearly it demonstrates the volatile nature of massive stars as they collapse and explode. It also shows, frankly, how much more we need to run to keep approximating the physics of core collapse more precisely. Studying that final simulation has also really been helping guide us forward as we prepare for the jump to Summit.

What does the work look like on the road to Summit?

Lentz: Part of the dilemma we have in designing a simulation is balancing how well resolved it is with how long it takes to run. I think one of the unfortunate aspects of our prior 3D work has been the 1,000-hour run time just to get an explosion started, not even to get it fully developed to an asymptotic state.

Unfortunately, all of our simulations are wrapped up in a single model, which means we can’t really isolate individual aspects for smaller runs. We’ve been focused on doing large-scale simulations, incorporating all the physics as best we can approximate it, and then continuing to improve those approximations and the overall performance of the simulation. However, this sort of leaves us in an awkward position where, to run simulations in 3D, we need a large computer running for a long time.

So, we’ve slightly reoriented ourselves as we look toward Summit. We came to a really happy place: running more simulations that are slightly less resolved, but that will allow us to run longer and explore more of the physics.
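One way to see why a slightly coarser grid buys so much extra run time is a generic cost-scaling sketch for an explicit 3D hydrodynamics code (an illustration of the resolution versus run-time trade-off, not a description of Lentz’s actual code): halving the grid spacing multiplies the cell count by eight and, through the CFL limit on the time step, roughly doubles the number of steps, so cost grows about sixteen-fold; running modestly coarser works the same lever in reverse.

```python
# Generic cost-scaling sketch for an explicit 3D hydrodynamics code
# (an illustration of the resolution/run-time trade-off, not Lentz's code).
# Cost ~ (number of cells) x (number of time steps); in 3D the cell count
# scales as refinement**3, and the CFL condition shrinks the time step in
# proportion to the grid spacing, adding roughly one more factor.
def relative_cost(refinement: float) -> float:
    """Cost relative to a baseline grid whose spacing is divided by `refinement`."""
    return refinement ** 4

print(relative_cost(2.0))   # refining 2x costs ~16x as much
print(relative_cost(0.8))   # a 20% coarser grid costs ~0.41x, freeing ~2.4x for longer runs
```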

What are some of your major goals for running on Summit?

Lentz: Right now, we’re focusing on a few more final improvements to our code to meet our goals for next year, and there is a little speedup left to do. With the combination of Summit’s GPU components and improvements to our code, I think we can get our run times much shorter. So, with Titan we would run simulations modeling a half second or three-quarters of a second of core collapse, and those would be about 1,000-hour runs. With Summit, we’re aiming at running a full second in 700 or 800 hours.
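Using the round numbers quoted above, the implied gain in simulated time per wall-clock hour works out to roughly a factor of two to three; the exact ratio below is our arithmetic, with the midpoint of 700–800 hours assumed:

```python
# Simulated star time per wall-clock hour, from the round numbers quoted above.
titan_seconds, titan_hours = 0.5, 1_000    # ~0.5 s of core collapse in ~1,000 hours on Titan
summit_seconds, summit_hours = 1.0, 750    # goal: ~1 s in 700-800 hours on Summit (midpoint assumed)

titan_rate = titan_seconds / titan_hours
summit_rate = summit_seconds / summit_hours
print(f"Titan:  ~{titan_rate * 1e3:.2f} ms of star time per wall-clock hour")   # ~0.50 ms
print(f"Summit: ~{summit_rate * 1e3:.2f} ms of star time per wall-clock hour")  # ~1.33 ms
print(f"implied throughput gain: ~{summit_rate / titan_rate:.1f}x")             # ~2.7x
```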

Our code is relatively mature in that we’ve run fairly long simulations in 2D, so a lot of the basic physics challenges have been captured—but there’s always the potential for the 3D parts to catch us out. I think we’ve done a total of eight or nine 3D runs that have gone relatively long, and we have many more 2D runs that have gone much longer. This is what we’re hoping to do next year with Summit: to take 3D runs much longer than we have before.

UT-Battelle LLC manages Oak Ridge National Laboratory for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.