A research team from the University of California, Santa Cruz, has used the Oak Ridge Leadership Computing Facility’s Summit supercomputer to run one of the most complete cosmological models yet to probe the properties of dark matter, the hypothetical form of matter thought to provide the scaffolding of the cosmic web, which remains largely a mystery some 90 years after its existence was first proposed.
According to the Lambda cold dark matter (Lambda-CDM) model of Big Bang cosmology, the working model of the universe that many astrophysicists agree best explains why it is the way it is, 85% of the total matter in the universe is dark matter. But what exactly is dark matter?
“We know that there’s a lot of dark matter in the universe, but we have no idea what makes up that dark matter, what kind of particle it is. We just know it’s there because of its gravitational influence,” said Bruno Villasenor, a former doctoral student at UCSC and lead author of the team’s paper, which was recently published in Physical Review D. “But if we can constrain the properties of the dark matter that we see, then we can discard some possible candidates.”
By producing over 1,000 high-resolution hydrodynamical simulations on the Summit supercomputer located at the U.S. Department of Energy’s Oak Ridge National Laboratory, the team modeled the Lyman-Alpha Forest, a series of absorption features that forms as light from distant, bright objects called quasars passes through patches of diffuse cosmic gas on its journey to Earth. These gas patches are all moving at different speeds and have different masses and extents, forming a “forest” of absorption lines.
The researchers then simulated universes with different dark matter properties that affect the structure of the cosmic web and, in turn, the fluctuations of the Lyman-Alpha Forest. The team compared the results from the simulations with the fluctuations in the actual Lyman-Alpha Forest observed by telescopes at the W. M. Keck Observatory and the European Southern Observatory’s Very Large Telescope, then eliminated dark matter contenders until they found the closest match.
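As a rough illustration of that comparison step, the sketch below (a toy Python example, not the team’s actual analysis pipeline) computes the 1D power spectrum of transmitted-flux fluctuations for mock “skewers” and scores each candidate model against a stand-in for the observed spectrum with a simple chi-square statistic. All data, box sizes, and error bars here are invented placeholders.

```python
# Minimal sketch of comparing simulated Lyman-Alpha Forest skewers to observations:
# compute the 1D power spectrum of flux fluctuations and score each model with a
# chi-square statistic. Everything below is a toy placeholder, not the study's data.
import numpy as np

def flux_power_spectrum(flux, box_size):
    """1D power spectrum of the flux contrast delta_F = F/<F> - 1."""
    delta = flux / flux.mean() - 1.0
    n = delta.size
    dk = 2.0 * np.pi / box_size
    delta_k = np.fft.rfft(delta) * (box_size / n)   # continuum FT convention
    power = np.abs(delta_k) ** 2 / box_size
    k = np.arange(power.size) * dk
    return k[1:], power[1:]                         # drop the k = 0 mode

def chi_square(p_model, p_obs, sigma_obs):
    return np.sum(((p_model - p_obs) / sigma_obs) ** 2)

# Toy "skewers": random transmitted-flux fields standing in for simulation output
# and for the Keck/VLT measurements used in the study.
rng = np.random.default_rng(42)
box = 50.0                                          # comoving Mpc/h (illustrative)
obs_flux = np.clip(rng.normal(0.8, 0.1, 2048), 0.0, 1.0)
models = {"cold": np.clip(rng.normal(0.8, 0.12, 2048), 0.0, 1.0),
          "warm": np.clip(rng.normal(0.8, 0.09, 2048), 0.0, 1.0)}

k_obs, p_obs = flux_power_spectrum(obs_flux, box)
sigma = 0.1 * p_obs + 1e-12                         # crude stand-in for error bars
for name, flux in models.items():
    _, p_model = flux_power_spectrum(flux, box)
    print(f"{name} dark matter model: chi^2 = {chi_square(p_model, p_obs, sigma):.1f}")
```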
The team’s results ran contrary to the Lambda-CDM model’s central assumption that the universe’s dark matter is cold dark matter (the “cold” in the model’s name refers to the particles’ slow thermal velocities rather than their temperature). Instead, the study’s best-fitting candidate pointed the opposite way: we may be living in a universe of warm dark matter, whose particles have faster thermal velocities.
“Lambda-CDM provides a successful view on a huge range of observations within astronomy and cosmology. But there are slight cracks in that foundation. And what we’re really trying to do is push at those cracks and see whether there are issues with that fundamental foundation. Are we on solid ground?” said Brant Robertson, project leader and a professor at UCSC’s Astronomy and Astrophysics Department.
Beyond possibly unsettling a few long-held assumptions about dark matter (and the universe itself), the UCSC project also stands out as a computational feat. The team produced an unprecedentedly comprehensive set of simulations with state-of-the-art software that accounts for the physics shaping the structure of the cosmic web and leverages the computational power of the world’s largest supercomputers.
The History of the Universe, Pt. 0
One of the main ingredients of the universe is dark matter. Astrophysicists are certain that without dark matter, there would be no galaxies, planets, or life as we know it. Consider dark matter to be the foundation of the cosmic web: because of its gravitational influence, the gas of the intergalactic medium falls into dark-matter clumps, where it forms stars, and those stars build up galaxies.
However, there is a sizable gap in this seemingly straightforward theory: scientists have yet to directly detect dark matter. It doesn’t interact at all with electromagnetic radiation, such as radio waves, microwaves, or visible light (hence the name), making it extremely difficult to study from Earth.
Because we don’t know what particles make up dark matter, its fundamental properties are also unknown, and even its presence in the universe remains hypothetical. But scientists have concluded that dark matter must exist because some form of unseen matter is interacting with normal matter through gravity. Without dark matter, they cannot explain the velocity distributions in galaxies, for example.
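A quick, order-of-magnitude illustration of that gravitational argument (not drawn from the article) is sketched below in Python: if a galaxy contained only its visible mass, orbital speeds far from the center should fall off roughly as v = sqrt(GM/r), yet measured rotation curves stay nearly flat, implying extra unseen mass. The numbers are rough placeholders for a Milky-Way-like galaxy.

```python
# Classic gravitational argument for dark matter: visible mass alone predicts
# declining orbital speeds at large radii, but observed rotation curves stay flat.
# All numbers are order-of-magnitude placeholders, not measurements.
import numpy as np

G = 4.30e-6            # gravitational constant in kpc * (km/s)^2 / Msun
M_visible = 6e10       # rough stellar + gas mass of a Milky-Way-like galaxy [Msun]

radii = np.array([5.0, 10.0, 20.0, 40.0])          # galactocentric radii [kpc]
v_expected = np.sqrt(G * M_visible / radii)        # point-mass (Keplerian) prediction
v_observed = np.full_like(radii, 220.0)            # typical flat rotation speed [km/s]

for r, ve, vo in zip(radii, v_expected, v_observed):
    print(f"r = {r:4.0f} kpc: visible-mass prediction ~{ve:4.0f} km/s, observed ~{vo:3.0f} km/s")
```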
One way to distinguish between cold and warm dark matter is to study the clumpiness of the cosmic web. In a universe composed of cold dark matter, gravity leads to the formation of many small dark-matter clumps. In contrast, if dark matter is warm, the larger velocities of its particles prevent the formation of small clumps, resulting in a cosmic web that is smoother than in the Lambda-CDM scenario.
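To make that “smoother cosmic web” point concrete, here is a back-of-the-envelope Python illustration (not taken from the paper) of how free streaming by warm dark matter suppresses structure below a cutoff scale set by the particle mass. The fitting form and coefficients follow a commonly used parameterization from the literature (Viel et al. 2005) and should be treated as approximate.

```python
# Illustration of why warm dark matter yields a smoother cosmic web: free streaming
# suppresses the matter power spectrum below a cutoff scale that shrinks as the
# particle mass grows. Fitting form and coefficients are approximate (Viel et al. 2005).
import numpy as np

def wdm_suppression(k, m_wdm_kev, omega_dm=0.25, h=0.7):
    """Squared transfer function T^2(k) relating WDM to CDM power; k in h/Mpc."""
    nu = 1.12
    alpha = 0.049 * m_wdm_kev**-1.11 * (omega_dm / 0.25)**0.11 * (h / 0.7)**1.22
    return (1.0 + (alpha * k) ** (2.0 * nu)) ** (-10.0 / nu)

k = np.logspace(-1, 2, 5)          # wavenumbers in h/Mpc
for m in (1.0, 3.0, 10.0):         # candidate WDM particle masses in keV
    print(f"m = {m:4.1f} keV:", np.round(wdm_suppression(k, m), 3))
# Heavier (colder-acting) particles leave small scales nearly untouched, while
# lighter (warmer) particles wipe out the small clumps described in the text.
```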
“This is why the Lyman-Alpha Forest is very important for our study. When light travels through the universe, it interacts with gas, which follows the dark matter distribution. Because of that, we can infer some properties of dark matter from the Lyman-Alpha Forest,” said Villasenor, who graduated from UCSC last year and is now working at AMD as a high-performance applications engineer.
The UCSC team’s simulations on Summit, supported by the DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, revealed a slight preference for a warm dark matter universe; the properties of warm dark matter’s Lyman-Alpha Forest spectra better fit the observations. But the preference is only slight, and a cold dark matter universe also fits the data fairly well, so the project’s findings won’t overturn current cosmological models. They should inspire more research, however.
“What this is telling us is that we need better observations. This data is not enough to conclusively determine whether we live in a cold dark matter universe or in a warm dark matter universe,” Villasenor said. “This motivates us to make more observations of these quasars at higher resolutions.”
Expanding the Universe of Cholla
The UCSC team used a GPU-optimized hydrodynamics code called Cholla (Computational Hydrodynamics On ParaLLel Architectures) as the starting point for its simulations on Summit. Developed by Evan Schneider, an assistant professor in the University of Pittsburgh’s Department of Physics and Astronomy, Cholla was originally intended to help users better understand how the universe’s gases evolve over time by acting as a fluid dynamics solver. However, the UCSC team required several more physics solvers to tackle its dark matter project, so Villasenor integrated them into Cholla over the course of three years for his doctoral dissertation at UCSC.
“Basically, I had to extend Cholla by adding some physics: the physics of gravity, the physics of dark matter, the physics of the expanding universe, the physics of the chemical properties of the gases, and the chemical properties of hydrogen and helium,” Villasenor said. “How is the gas going to be heated by radiation in the universe? How is that going to propagate the distribution of the gas? These physics are necessary to do these kinds of cosmological hydrodynamical simulations.”
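To give a flavor of one of those ingredients, the standalone Python sketch below (schematic only; Cholla itself is written for GPUs in C++ and CUDA) integrates the flat Lambda-CDM Friedmann equation to follow the cosmic expansion within which the gas and dark matter evolve. The cosmological parameter values are illustrative, not those used in the study.

```python
# Schematic sketch of the expanding-universe background in a cosmological simulation:
# integrate the flat Lambda-CDM Friedmann equation for the scale factor a(t).
# Parameter values are illustrative only; this is not Cholla code.
import numpy as np

H0 = 67.0 * 1.0227e-3          # Hubble constant: 67 km/s/Mpc converted to 1/Gyr
OMEGA_M, OMEGA_L = 0.31, 0.69  # matter and dark-energy density parameters

def hubble(a):
    """Expansion rate H(a) for a flat Lambda-CDM universe."""
    return H0 * np.sqrt(OMEGA_M / a**3 + OMEGA_L)

# Integrate da/dt = a * H(a) forward from shortly after the Big Bang.
a, t, dt = 1e-3, 0.0, 1e-3     # scale factor, time [Gyr], time step [Gyr]
while a < 1.0:                 # evolve until today (a = 1)
    a += a * hubble(a) * dt
    t += dt

print(f"Age of this toy universe at a = 1: {t:.1f} Gyr")
```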
In the process, Villasenor assembled one of the most complete simulation codes for modeling the universe. Previously, astrophysicists typically had to choose which physics to include in their simulations. Now, combined with the computing power of Summit, they have many more physical parameters at their disposal.
“One of the things that Bruno accomplished is something that researchers have wanted to do for many years and was really only enabled by the supercomputer systems at OLCF: to actually vary the physics of the universe dramatically in many different ways,” Robertson said. “That’s a huge step forward—to be able to connect the physics simultaneously and do that in a way in which you can compare them directly with the observations.
“It just hasn’t been possible before to do anything like this. It’s orders of magnitude, in terms of computational challenge, beyond what had been done before.”
Schneider, who advised Villasenor on his work to extend Cholla, said she thinks his additions will be “totally critical” as she prepares Cholla for her own simulations on the new exascale-class Frontier supercomputer, which is housed along with Summit at the OLCF, a DOE Office of Science user facility at ORNL. She is leading a project through the Frontier Center for Accelerated Application Readiness program to simulate the Milky Way galaxy and will be using some of the solvers added by Villasenor.
“Astrophysics software is very different than other kinds of software because I don’t think there’s ever any sort of ultimate version, and that certainly isn’t the case for Cholla,” Schneider said. “You can think of Cholla as being a multitool, so the more pieces we add to our multitool, the more kinds of problems we can solve. If I built the original tool as just a pocketknife, then it’s like Bruno’s added a screwdriver—there are a whole class of problems we can solve now that we couldn’t address with the original code. As we keep adding more and more things, we’ll be able to tackle more and more complicated problems.”
Related Publication: B. Villasenor et al., “New Constraints on Warm Dark Matter from the Lyman-α Forest Power Spectrum,” Physical Review D 108, 023502 (2023): https://doi.org/10.1103/PhysRevD.108.023502.
UT-Battelle manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.