
Resolution Revolution

November 7, 2008 (updated April 3, 2013) | Science | 10 min read

This image was produced from a Carbon-Land Model Intercomparison Project (C-LAMP) simulation performed as part of a SciDAC2 project on NCCS supercomputers. Scientific visualization by Jamison Daniel, ORNL/NCCS

Software engineers turn to Pat Worley’s team to fix problems that threaten simulations.

Everyone has questions about climate change. Is it a good idea to convert forests and food croplands to produce plants for biofuels? What technologies best capture and store carbon? How intense will hurricanes and heat waves get? Will release of methane trapped in permafrost accelerate climate change?

Getting answers increasingly depends on climate simulations, which set in motion a digitized world that mirrors our past and present and probes our future. That world would not turn without software applications such as the Community Climate System Model (CCSM), a megamodel coupling four independent models whose codes describe Earth’s atmosphere, oceans, lands, and sea ice.

Such simulation tools run on supercomputers like those of the National Center for Computational Sciences (NCCS) at Oak Ridge National Laboratory (ORNL). Depending on those models is a global community of scientists representing, in the United States alone, organizations including the Department of Energy (DOE), National Center for Atmospheric Research (NCAR, funded by the National Science Foundation), National Oceanic and Atmospheric Administration, National Aeronautics and Space Administration, Environmental Protection Agency, and various national security agencies.

The Fourth Assessment Report (AR4) of the Intergovernmental Panel on Climate Change (IPCC) drew on simulations performed in 2004 and 2005. Using the latest version of the CCSM, researchers carried out computations on resources at the NCCS, NCAR, the National Energy Research Scientific Computing Center, and the Japanese Earth Simulator. John Drake, a scientist working at the intersection of computer science, climate science, and applied mathematics, called the simulations “a watershed event for climate science and for the way in which we provide computational simulation results to the community.” Drake leads ORNL’s contribution to the Climate Science Computational End Station, which will support the IPCC’s fifth assessment report, due in 2014.

“A fairly small team has been building the models and even a smaller team performs the simulations and posts the material on the Earth System Grid for others to retrieve,” he said. “Very few sites in the world can field the kind of computational horsepower that the NCCS does and that various other large climate and weather centers have internationally. The fact that you can perform these simulations and then make the results very quickly available to the university researchers and people who don’t have access to the machines or wherewithal to build the models multiplies the productivity of the science enterprise.”

The proof is in the papers: In the months after simulation data were posted, scientists produced roughly 300 journal articles. The IPCC cited the studies in AR4, which concluded planetary warming during the twentieth century was probably the result of human activities. In 2007 the IPCC shared the Nobel Peace Prize with Al Gore for its efforts.

Weather versus climate

How can we simulate climate 100 years from now if we don’t know the weather 100 days from now? “Climate is statistical, or average, weather,” explained Drake. “It can tell you if hurricanes will be more likely, less likely, or stronger, but it can’t tell you when they will occur.”

With weather, small changes early on cause large changes later. “You can’t do weather forecasting beyond 10 to 15 days because it’s based on chaos theory,” Drake said. “In Earth system modeling and climate studies, we’re always aware of the effect of chaos—the butterfly flaps its wings, and that changes the path weather is going on, the fundamental dynamics of the atmosphere. Some people would throw up their hands and say, ‘You can’t do this problem.’”

But if scientists study climate from a statistical standpoint—in essence sampling multiple flaps of the butterfly’s wing—they can approach the problem. “An ensemble of paths is then averaged to get the most likely path,” Drake said.
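The CCSM itself is a massive coupled Fortran code, but the ensemble idea Drake describes can be illustrated with a toy chaotic system. The Python sketch below (an illustration only, not part of the CCSM) integrates the Lorenz-63 equations from slightly perturbed initial conditions: the individual trajectories, the “weather,” diverge quickly, while the long-run time average, the “climate,” comes out nearly the same for every ensemble member.

```python
# Toy illustration of ensemble climate statistics using the Lorenz-63 system.
# Individual trajectories diverge rapidly (weather is unpredictable), but
# long-run statistics averaged over an ensemble of perturbed runs are stable.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 system one step with a 4th-order Runge-Kutta scheme."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def run_member(perturbation, n_steps=20000):
    """Integrate one ensemble member from a slightly perturbed initial condition."""
    state = np.array([1.0 + perturbation, 1.0, 1.0])
    zs = np.empty(n_steps)
    for i in range(n_steps):
        state = lorenz_step(state)
        zs[i] = state[2]
    return zs

rng = np.random.default_rng(0)
members = [run_member(1e-6 * rng.standard_normal()) for _ in range(10)]

# The instantaneous "weather" differs wildly between members after a short time...
print("final z per member:    ", np.round([m[-1] for m in members], 2))
# ...but the time-averaged "climate" statistic is nearly identical for each member.
print("time-mean z per member:", np.round([m[5000:].mean() for m in members], 2))
```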

Just how well do statistics model reality? “To the extent that we can reproduce the paleoclimate record or recent historical record, if emission scenarios and forcings are accurate, then we believe the models are reasonably accurate,” said NCCS Director James J. Hack, a renowned climate researcher who implements global models on high-performance computing systems. “But we know that they’re not reliable on smaller than subcontinental space scales. In fact, on space scales similar to the North American continent, there’s divergence in the models about what happens to precipitation over North America 100 years from now.”

To answer questions about climate change at local levels, such as what’s going to happen in the Tennessee Valley in a decade, scientists need higher-resolution models. “We want to employ numerical algorithms that can scale to use many, many, many more processors and keep the time to solution about the same,” Hack said.

“To evaluate local or regional impacts of climate change, the computational requirements for climate modeling go way up,” said ORNL computational scientist Patrick Worley. “The models are currently not able to efficiently exploit the computing resources that will be available in the near future. To take advantage of those, we need to modify some of the models fairly dramatically.”
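A rough back-of-the-envelope calculation shows why those requirements climb so steeply. Under a standard scaling argument (an assumption for illustration, not a figure from the article), refining the horizontal grid of an explicit atmospheric model multiplies the number of grid columns and, through the CFL stability limit, forces a proportionally smaller time step; the Python sketch below tallies the combined effect.

```python
# Rough cost estimate for refining an explicit atmosphere model's horizontal grid.
# Assumption (standard scaling argument, not data from the article): cost scales
# with (number of grid columns) x (number of time steps), and the CFL stability
# limit ties the allowable time step to the grid spacing.
def relative_cost(refinement_factor):
    """Cost multiplier when horizontal grid spacing shrinks by `refinement_factor`."""
    columns = refinement_factor ** 2   # finer spacing in both horizontal directions
    timesteps = refinement_factor      # CFL: time step shrinks with grid spacing
    return columns * timesteps

for r in (2, 4, 8):
    print(f"{r}x finer grid -> ~{relative_cost(r):.0f}x more computation")
# 2x finer -> ~8x; 4x finer -> ~64x; 8x finer -> ~512x
```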

Performance police

When scientists want more accurate or more detailed simulations, they turn to modeling experts and software engineers who upgrade the capabilities of the simulation models. When the software engineers need help, they turn to Pat Worley. With Arthur Mirin of Lawrence Livermore National Laboratory and Raymond Loy of Argonne National Laboratory, he leads a DOE Scientific Discovery through Advanced Computing (SciDAC) project to scale up climate codes, enabling them to solve larger problems on more processors, and to evaluate climate software and new high-performance computing platforms such as the Cray XT4 and IBM Blue Gene/P supercomputers.

“An important practical aspect of climate science is figuring how much science you can get in the model and still get the simulations done in time,” said Worley, whose team works with researchers and manufacturers to identify bugs in CCSM codes, performance bottlenecks in the algorithms used in the CCSM, and glitches in a machine’s software. “Our contribution is getting the component models to run as efficiently as possible. The software engineering aspects of the code are always changing, and often the new code has unexpected performance issues. We monitor things. We’re kind of the performance police.”

Worley and his colleagues push codes to their limits. If a code runs slowly on 1,000 processors but quickly on 2,000, they might assign more processors to work on a problem. If, due to algorithmic restrictions, the code can’t use more than 1,000 processors, changing algorithms may be the only option to improve performance. Different science also imposes different performance requirements. Ocean scientists may choose to run a high-resolution ocean model coupled to a low-resolution atmosphere model, whereas atmospheric scientists may pick the converse. Changes to the codes to improve performance for one scenario must not slow down the code for another or hurt performance on a different (or future) platform. Often the performance team introduces algorithm or implementation options that scientists can choose to optimize performance for a given simulation run or on a particular computer system, as sketched below.
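The kind of judgment Worley describes often starts from strong-scaling measurements: run the same problem on more and more processors and watch where the speedup flattens. The Python sketch below uses made-up timing numbers (hypothetical, not actual CCSM benchmarks) to show how parallel efficiency exposes the point where adding processors stops paying off.

```python
# Hedged sketch: judging whether more processors help, from strong-scaling timings.
# The processor counts and wall-clock times below are illustrative, invented numbers.
timings = {  # processors -> seconds per simulated model day (hypothetical)
    500: 120.0,
    1000: 64.0,
    2000: 40.0,
    4000: 38.0,
}

base_procs = min(timings)
base_time = timings[base_procs]

print(f"{'procs':>6} {'speedup':>8} {'efficiency':>11}")
for procs in sorted(timings):
    speedup = base_time / timings[procs]
    # Parallel efficiency: measured speedup divided by ideal speedup (procs / base_procs).
    efficiency = speedup / (procs / base_procs)
    print(f"{procs:>6} {speedup:>8.2f} {efficiency:>11.0%}")

# A sharp drop in efficiency (here between 2,000 and 4,000 processors) signals an
# algorithmic scaling limit: beyond that point, adding processors no longer pays off,
# and a different algorithm or domain decomposition is needed.
```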

On Cray and IBM systems, the group has improved performance through both algorithmic and implementation efforts. Recent work improved performance 2.5-fold on benchmark problems on ORNL’s Cray XT4. “With the improvements to the scalability of the CCSM software by Pat and his colleagues, along with the dramatic growth in the performance of Jaguar, the CCSM developers are seriously considering model resolutions and advanced physical processes that were not on the table before,” said Trey White, who as NCCS liaison to the CCSM project helps the scientists make the most of the machines.

“Pat Worley’s group has provided critical support in improving the scalability and performance of the CCSM across a wide range of architectures,” said NCAR’s Mariana Vertenstein, head of the engineering group responsible for CCSM’s software development, support, and periodic community releases. “The CCSM project played a major role in the IPCC AR4 through an extensive series of modeling experiments and in fact resulted in the most extensive ensemble of any of the international global coupled models run for the IPCC AR4. This accomplishment could not have occurred without Pat’s contributions.”

Worley’s team is currently working with a large multilab SciDAC project, led by Drake with Phil Jones of Los Alamos National Laboratory, to build a first-generation Earth System Model, which will extend the physical climate model by including chemical and ecological processes. The computer allocations are provided through the Climate Science Computational End Station, an Innovative and Novel Computational Impact on Theory and Experiment (INCITE) award led by NCAR’s Warren Washington that runs on Jaguar at the NCCS.

“For DOE, which is very concerned with the carbon cycle and with the impact of climate change on ecology and ecosystem services, this kind of Earth system model is really called for,” Drake said. “We’re trying to do whatever we can to get there as quickly as possible.”
