This story was originally written by Aaron Dubrow at the Texas Advanced Computing Center and adapted by Rachel McDowell for the Oak Ridge Leadership Computing Facility.
A randomly selected 3,000-year segment of the physics-based simulated catalog of earthquakes in California, created on the Texas Advanced Computing Center’s Frontera supercomputer. Video Credit: Kevin Milner, University of Southern California
A team led by researchers at the Southern California Earthquake Center (SCEC) at the University of Southern California has used the Oak Ridge Leadership Computing Facility’s (OLCF’s) Summit, the nation’s most powerful and smartest supercomputer, to simulate the impact of large earthquakes at 10 different sites across California.
Using the prototype Rate-State earthquake simulator (RSQSim), which models hundreds of thousands of years of seismic history in California, coupled with a computational application called CyberShake, the team completed major ground-motion simulations on Summit. The simulated motions compare well with historical earthquakes and with the results of other methods, and they display a realistic distribution of earthquake probabilities. The results of the study were published in the Bulletin of the Seismological Society of America.
“We haven’t observed most of the possible events that could cause large damage,” explained Kevin Milner, a computer scientist and seismology researcher at SCEC who led the study with Columbia University’s Bruce Shaw.
The rarity of massive earthquakes is precisely why determining the risk to a specific location or structure is challenging. Researchers take a multifaceted approach to studying earthquakes in an effort to bridge the gaps in earthquake data. Scientists can dig trenches across earthquake faults to study past earthquakes, create hazard models from previous earthquake data, and use supercomputers to simulate a specific earthquake in a specific place with a high degree of fidelity.
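As a rough illustration of the second approach, hazard models built from past data often start from the Gutenberg-Richter relation, log10 N(>=M) = a - b*M, which links magnitude to how often earthquakes of at least that size occur. The sketch below, using made-up magnitudes rather than any real or SCEC catalog, fits that relation and reads off an annual rate for magnitude 7 events.

```python
import numpy as np

# Hypothetical magnitudes from a 100-year regional catalog (illustrative only).
magnitudes = np.array([4.1, 4.3, 4.8, 5.0, 5.2, 5.6, 6.1, 6.4, 7.0])
years_observed = 100.0

# Annual rate of events at or above each magnitude threshold.
thresholds = np.arange(4.0, 7.1, 0.5)
rates = np.array([(magnitudes >= m).sum() / years_observed for m in thresholds])

# Fit the Gutenberg-Richter relation log10 N(>=M) = a - b*M by least squares.
slope, a = np.polyfit(thresholds, np.log10(rates), 1)
b = -slope

# Expected annual rate of a magnitude 7 or larger event under the fitted model.
rate_m7 = 10 ** (a - b * 7.0)
print(f"b-value: {b:.2f}; expected M>=7 rate: {rate_m7:.4f} per year")
```

The weakness this article points at is visible even here: with only a handful of large events in the record, the fitted rate for rare magnitudes is highly uncertain, which is exactly the gap that long physics-based synthetic catalogs aim to fill.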
The combination of RSQSim and CyberShake, however, offers a novel framework for predicting the likelihood and impact of earthquakes over an entire region and many seismic cycles. Developed by researchers at SCEC over the past decade, the framework helps researchers determine which locations might experience the greatest impact from an earthquake, allowing building code developers, architects, and structural engineers to design more resilient buildings that can survive earthquakes at a specific site.
“For the first time, we have a whole pipeline from start to finish where earthquake occurrence and ground-motion simulation are physics-based,” Milner said. “It can simulate up to hundreds of thousands of years on a really complicated fault system.”
The team initially used RSQSim on the Frontera supercomputer, the National Science Foundation’s (NSF’s) leadership-class national resource at the Texas Advanced Computing Center, to simulate how stresses form and dissipate over 800,000 virtual years as tectonic forces act on the Earth in regions of California. Using Frontera, as well as the Blue Waters system at the National Center for Supercomputing Applications, the team generated a massive catalog of earthquake data, each entry recording that an earthquake of a certain magnitude and attributes occurred at a certain place and time. The catalog required 8 days of continuous computing on Frontera and was among the largest ever made, said Christine Goulet, Executive Director for Applied Science at SCEC.
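Conceptually, each catalog entry is a structured record of one simulated event. A minimal sketch of what such a record might look like follows; the field names and values are assumptions for illustration, not RSQSim’s actual output format.

```python
from dataclasses import dataclass

@dataclass
class CatalogEvent:
    """One simulated earthquake from a synthetic catalog (hypothetical schema)."""
    time_years: float   # occurrence time within the simulated history
    magnitude: float    # moment magnitude
    latitude: float
    longitude: float
    depth_km: float
    fault_id: int       # which modeled fault section ruptured

# A tiny illustrative catalog; the real one spans roughly 800,000 years.
catalog = [
    CatalogEvent(12.7, 7.8, 35.1, -119.6, 8.0, 42),
    CatalogEvent(305.2, 6.2, 34.0, -118.2, 11.5, 7),
    CatalogEvent(311.9, 5.1, 33.7, -116.2, 6.3, 19),
]

# The rare, damaging events, e.g., magnitude 7 and above, are the ones
# hazard studies care about most.
large_events = [e for e in catalog if e.magnitude >= 7.0]
print(f"{len(large_events)} of {len(catalog)} events are M7+")
```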
Under an Innovative and Novel Computational Impact on Theory and Experiment, or INCITE, allocation of computer time, the team used the OLCF’s 200-petaflop Summit supercomputer to feed the outputs of RSQSim into the CyberShake simulator, which again used geophysics-based computer models to predict how much shaking, in terms of ground acceleration, velocity, and duration, would occur as a result of each quake. Summit is the flagship system of the OLCF, a US Department of Energy (DOE) Office of Science User Facility located at DOE’s Oak Ridge National Laboratory.
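CyberShake itself is a large physics-based simulation platform, but the intensity measures named above have standard definitions: peak ground acceleration is the largest absolute acceleration, peak ground velocity comes from integrating the acceleration record, and significant duration is commonly taken as the window containing 5 to 95 percent of the Arias intensity (the cumulative integral of acceleration squared). The sketch below computes all three on a synthetic record; it is a minimal illustration, not CyberShake code.

```python
import numpy as np

dt = 0.01                                   # sample interval, seconds
t = np.arange(0.0, 20.0, dt)

# Synthetic acceleration record (m/s^2): decaying band of random shaking.
rng = np.random.default_rng(0)
accel = rng.standard_normal(t.size) * np.exp(-0.3 * t)

# Peak ground acceleration: largest absolute acceleration.
pga = np.max(np.abs(accel))

# Peak ground velocity: integrate acceleration, take the largest magnitude.
velocity = np.cumsum(accel) * dt
pgv = np.max(np.abs(velocity))

# Significant duration D5-95: time between 5% and 95% of Arias intensity.
arias = np.cumsum(accel**2) * dt
arias /= arias[-1]                          # normalize to 1 at the end
d5_95 = t[np.searchsorted(arias, 0.95)] - t[np.searchsorted(arias, 0.05)]

print(f"PGA {pga:.2f} m/s^2, PGV {pgv:.3f} m/s, D5-95 {d5_95:.1f} s")
```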
“The framework outputs a full slip-time history: where a rupture occurs and how it grows,” Milner explained. “We found it produces realistic ground motions, which tells us that the physics implemented in the model is working as intended.”
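A slip-time history records, for each patch of the modeled fault surface, when it began to slip and how much it ultimately slipped. One standard consistency check on such output is the seismic moment, M0 = rigidity x sum(patch area x patch slip), and the corresponding moment magnitude, Mw = (2/3) log10(M0) - 6.07 for M0 in newton-meters. The sketch below applies those textbook formulas to hypothetical patch data; the layout is illustrative, not RSQSim’s format.

```python
import numpy as np

# Hypothetical slip-time history: per-patch rupture onset and final slip.
# (Illustrative values; not an actual RSQSim output.)
onset_s = np.array([0.0, 0.8, 1.5, 2.4])   # when each patch starts slipping
slip_m = np.array([2.1, 3.0, 2.6, 1.2])    # final slip on each patch, meters
area_m2 = np.full(4, 9.0e6)                # patch area: 3 km x 3 km

# Seismic moment M0 = rigidity * sum(area * slip); ~30 GPa is a typical
# rigidity for crustal rock.
mu = 3.0e10
m0 = mu * np.sum(area_m2 * slip_m)

# Moment magnitude (Hanks and Kanamori): Mw = (2/3) * log10(M0) - 6.07.
mw = (2.0 / 3.0) * np.log10(m0) - 6.07
print(f"Rupture grows over {onset_s.max():.1f} s; moment magnitude Mw = {mw:.2f}")
```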
The work is helping to determine the probability of an earthquake occurring along any of California’s hundreds of earthquake-producing faults, the magnitude of earthquake that could be expected, and how one quake may trigger others.
The team plans further validation of the results, a critical step before the framework can be accepted for design applications.
“The hope is that these types of models will help us better characterize seismic hazard so we’re spending our resources to build strong, safe, resilient buildings where they are needed the most,” Milner said.
Support for the project comes from the US Geological Survey, NSF, and the W. M. Keck Foundation.
Related Publication: Kevin R. Milner, Bruce E. Shaw, Christine A. Goulet, Keith B. Richards‐Dinger, Scott Callaghan, Thomas H. Jordan, James H. Dieterich, and Edward H. Field, “Toward Physics‐Based Nonergodic PSHA: A Prototype Fully Deterministic Seismic Hazard Model for Southern California.” Bulletin of the Seismological Society of America (2021). doi:10.1785/0120200216.
UT-Battelle LLC manages Oak Ridge National Laboratory for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.