When a climate modeler looks at Earth, he or she sees a world of nonlinear systems—systems that at first glance appear chaotic or random but are, in fact, just incredibly complex.
Energy from one source has many ramifications throughout the climate system. In a simple example, energy in the form of sunlight can affect air and water temperatures. Those temperature changes may in turn affect crop yields, bringing rich plant growth to some regions and famine to others. Or they may alter water availability, bringing heavier floods to some regions and water stress to others.
Tracking all that energy—the energy that melts ice caps, fuels storm systems, and nourishes vegetation—is the job of climate models. Tracking energy is how global climate simulations project changes in average temperature and sea level, but there’s much more to learn.
The Community Earth System Model (CESM), which is supported by the Department of Energy (DOE) and the National Science Foundation, is a global model for simulating Earth’s past, present, and future climate. CESM, released in its latest version in June 2014, is expected to project complex climate scenarios with greater fidelity than previous community models, such as its predecessor, the Community Climate System Model (CCSM), which has been the predictive tool used for many climate assessments over the last 18 years.
CESM models climate patterns at higher spatial resolution than previous models and includes new equations governing atmospheric chemistry, biogeochemical processes, and energy absorption and emission. These improvements will lead to better assessments of regional climate details and improve understanding of how extreme weather statistics may change in the future.
A team of computational and climate scientists is scaling the model to run on DOE petascale computers, including Titan, the 27-petaflop Cray XK7 system managed by the Oak Ridge Leadership Computing Facility (OLCF) at Oak Ridge National Laboratory (ORNL), and Mira, the IBM Blue Gene/Q system at Argonne National Laboratory (ANL).
“We’re running climate model resolutions typically used by weather forecast models, which run at much higher resolutions because they are only looking at a few days at a time,” said principal investigator Mark Taylor of Sandia National Laboratories.
CESM is expected to better serve policymakers and stakeholders who plan for recurring severe weather such as storms, droughts, tropical cyclones, and floods. It is also expected to improve the information available to households and businesses that plan for severe weather, shifting seasonal norms, sea level rise, and other environmental changes.
“CESM is a key tool that will allow us to better understand the science underpinning these phenomena and how they might be affected by climate change,” Taylor said.
But before CESM can begin to predict any of that, it has to run properly on the supercomputers that will crunch quadrillions of calculations per second to zoom into local environments and track more energy exchanges among CESM’s four components: atmosphere, land, ice, and ocean. These components are simulated by the Community Atmosphere Model, the Community Land Model, the Community Ice Code, and the Parallel Ocean Program, respectively.
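To give a sense of that structure, the sketch below shows how a coupled model’s driver might step its components and exchange fluxes between them. It is a minimal illustration under invented names, not CESM’s actual coupler interface.

```python
# Minimal sketch of a coupled-model driver loop (hypothetical; not CESM's API).
# Each component advances one coupling interval, then hands boundary fluxes
# (heat, moisture, momentum) to the others through a simple exchange table.

class Component:
    """Stand-in for an atmosphere, land, ice, or ocean component."""

    def __init__(self, name):
        self.name = name

    def step(self, forcing, dt):
        # A real component would integrate its governing equations here,
        # using `forcing` (fluxes from the other components) as boundary input.
        return {"heat_flux": 0.0, "moisture_flux": 0.0}  # placeholder values


def run_coupled(components, n_steps, dt):
    fluxes = {c.name: {} for c in components}
    for _ in range(n_steps):
        for comp in components:
            # Each component sees the fluxes the others produced last interval.
            forcing = {k: v for k, v in fluxes.items() if k != comp.name}
            fluxes[comp.name] = comp.step(forcing, dt)


# One simulated day of 30-minute coupling intervals:
run_coupled([Component(n) for n in ("atm", "lnd", "ice", "ocn")],
            n_steps=48, dt=1800.0)
```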
To prepare CESM for scientific research, the multi-lab team completed simulations for performance evaluations in two major stages. The first stage was to calibrate CAM5 (the fifth version of the Community Atmosphere Model), which received many upgrades for the transition from CCSM to CESM.
A new atmosphere model
During the planning stages for Titan several years ago, Taylor and other team members worked with software engineers to replace CESM’s atmospheric dynamical core. OLCF staff then began preparing the new dynamical core for Titan’s GPUs as part of the OLCF’s Center for Accelerated Application Readiness.
Although this first round of CESM performance runs was conducted on CPUs only, future projects will scale the model to Titan’s GPUs. These CPU runs provided benchmarks against which researchers can measure future GPU performance and plan how best to use the accelerators.
Based on these early performance tests, Taylor predicts future versions of CESM will run about 1.5 times faster with GPU acceleration, cutting estimated computing time for a 30-to-40-year, coupled CESM simulation (the separate land, ocean, ice, and atmosphere models running in tandem) from 90 million to 64 million Titan core hours.
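As a quick sanity check using only the figures quoted above, the projected core-hour savings correspond to a speedup of roughly 1.4, consistent with the “about 1.5 times” estimate:

```latex
\text{speedup} \approx \frac{90\ \text{million core hours}}{64\ \text{million core hours}} \approx 1.4
```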
That anticipated speedup begins with CAM5’s new dynamical core, which solves the partial differential equations governing atmospheric fluid dynamics.
“The previous dynamical cores were based on latitude/longitude grids, but they could only efficiently use as many processors as there are latitude bands,” Taylor said. “We went to a more uniform grid in which each processor gets a small, square patch of Earth. The ability to decompose the planet into smaller pieces means that we can take better advantage of the parallelism of these machines.”
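Taylor’s point can be made concrete with a toy count of how many processors each decomposition can keep busy. The grid sizes below are invented for illustration; the real CAM-SE decomposition is more sophisticated.

```python
# Toy comparison of decomposition limits (illustrative numbers only).

def max_ranks_latlon(n_lat_bands):
    # A latitude-band decomposition can use at most one MPI rank per band,
    # no matter how many grid columns each band contains.
    return n_lat_bands

def max_ranks_elements(n_elements):
    # A quasi-uniform element grid (e.g., a cubed sphere) assigns each rank
    # a small patch of elements, so parallelism scales with the element count.
    return n_elements

print(max_ranks_latlon(768))              # hundreds of ranks at best
print(max_ranks_elements(6 * 120 * 120))  # cubed sphere: 86,400 elements
```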
The new dynamical core uses a spectral element method, a type of finite element method ideally suited for hybrid CPU–GPU machines like Titan. Running computations in parallel, or simultaneously, on the GPU accelerators will enable Titan to include more detail, such as a suite of more than 100 chemical reactions and increased vertical resolution.
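In a spectral element method, the solution on each element is represented by high-order polynomials. A generic one-dimensional form of that expansion, in standard textbook notation rather than CAM-SE’s specific formulation, is:

```latex
u(x)\big|_{\Omega_e} \approx \sum_{i=0}^{N} u_i^e \, \psi_i(x)
```

Here the basis functions ψ_i are Lagrange polynomials anchored at the Gauss–Lobatto–Legendre nodes of element Ω_e, and neighboring elements share their boundary nodes so the global field remains continuous. Because the work within each element is dense and local, it maps naturally onto GPU threads.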
CAM5 has typically been run at 110-kilometer (68-mile) grid spacing, but with the new dynamical core on Titan, the spacing shrinks to 27 kilometers (17 miles) or even finer.
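That refinement is computationally expensive. As a rough rule of thumb (not a measured CESM figure), shrinking the grid spacing multiplies the number of columns by the square of the refinement factor, and the stability (CFL) limit shortens the time step by the same factor, so cost grows roughly with its cube:

```latex
\left(\frac{110}{27}\right)^{2} \times \frac{110}{27} = \left(\frac{110}{27}\right)^{3} \approx 68
```

In other words, going from 110- to 27-kilometer spacing implies roughly 70 times more computation per simulated year.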
CAM5 also adds 25 aerosol species, small particulates in the atmosphere from natural and anthropogenic sources.
“CAM5 includes new thermodynamic models that better represent the indirect effect of aerosols, which influences cloud formation and precipitation,” said Kate Evans, a co-principal investigator. The impact of aerosols on net anthropogenic forcing (the overall effect of human activity on the radiative balance that warms or cools the planet) is listed as a key uncertainty in the Intergovernmental Panel on Climate Change’s Fifth Assessment Report, and simulating aerosols can help researchers better understand the factors behind temperature, cloud formation, and precipitation.
A balancing act
After scaling CAM5 to Titan, the team ran preindustrial, coupled simulations on Titan and Mira to establish thermodynamic equilibrium.
“We get the atmosphere calibrated, then we run preindustrial control simulations, or simulations without additional man-made greenhouse gases,” Taylor said.
To effectively track energy, climate models must be in thermodynamic equilibrium (or balanced in energy) so researchers can measure changes. Because burning fossil fuels for power gained exceptional momentum following the Industrial Revolution, accelerating the release of carbon dioxide and other greenhouse gases, scientists calibrate climate models based on preindustrial averages for temperature, humidity, and other parameters.
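The balance in question is the planet’s top-of-atmosphere energy budget. In its standard zero-dimensional textbook form (shown here for orientation, not as CESM’s specific diagnostic), equilibrium means the net flux is near zero:

```latex
N = \frac{S_0}{4}(1 - \alpha) - \mathrm{OLR} \approx 0
```

Here S_0 is the incoming solar irradiance, α the planetary albedo, and OLR the outgoing longwave radiation. A well-calibrated preindustrial control holds N near zero so that later changes can be attributed to imposed forcings rather than to the model itself.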
“If the model is not in balance when you begin present-day ensemble runs, you’ll have drift,” Taylor said. By starting in thermodynamic equilibrium, researchers can better determine how (and how much) human activity is influencing climate.
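One common way to quantify drift, shown below as a generic diagnostic rather than the team’s specific procedure, is to fit a linear trend to the control run’s global-mean surface temperature; a balanced model shows a slope near zero.

```python
import numpy as np

def drift_per_century(annual_gmst):
    """Linear trend of global-mean surface temperature, in K per century.

    annual_gmst: 1-D array of annual means from a control simulation.
    A balanced preindustrial control should yield a slope near zero.
    """
    years = np.arange(len(annual_gmst))
    slope_per_year, _intercept = np.polyfit(years, annual_gmst, 1)
    return 100.0 * slope_per_year

# Synthetic example: a 35-year control run with a small residual drift.
rng = np.random.default_rng(seed=0)
gmst = 287.0 + 0.001 * np.arange(35) + 0.1 * rng.standard_normal(35)
print(f"drift ~ {drift_per_century(gmst):+.2f} K/century")
```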
Because CESM includes more complex physics calculations than previous models and demands higher resolution, a big part of the team’s job is assigning values to newly introduced parameters and adjusting existing parameters for higher resolution.
“It’s all about getting the model as realistic as possible,” Taylor said. “Some parameters of the model are not well understood scientifically, such as the relationship between temperature, pressure, moisture, and cloud formation, so we compare the model runs to observation to see if we need to adjust the parameters.”
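Such model-versus-observation comparisons are often summarized with simple skill scores. The sketch below computes an area-weighted root-mean-square error on a latitude-longitude grid; the field values are invented, and real evaluations span many variables and seasons.

```python
import numpy as np

def rmse(model_field, obs_field, lat_deg):
    """Area-weighted RMSE on a regular lat-lon grid.

    Weights each row by cos(latitude) so that densely packed
    high-latitude points do not dominate the score.
    """
    weights = np.cos(np.deg2rad(lat_deg))[:, None] * np.ones_like(model_field)
    squared_error = (model_field - obs_field) ** 2
    return np.sqrt(np.sum(weights * squared_error) / np.sum(weights))

# Toy 3x4 grids of surface temperature (values invented for illustration):
lat = np.array([-60.0, 0.0, 60.0])
model = np.full((3, 4), 288.0)
obs = model + np.array([[0.5, -0.2, 0.1, 0.3]] * 3)
print(f"RMSE = {rmse(model, obs, lat):.3f} K")
```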
This second stage, completed this summer, simulated climate under controlled conditions for 35 years and included all four components, further fine-tuning the model’s software.
“Once we run our preindustrial control simulation, we should be able to make comparisons to show that CESM is doing a better job than the older models,” Taylor said. “Once we’ve gone through this process, we’re hoping we will have a climate model that agrees well with what we’ve seen in observation so we can begin the present-day ensembles with confidence.”