
Jack Wells, OLCF Director of Science, organized 3 Focus Sessions on “Petascale Science and Beyond: Applications and Opportunities in Materials Science and Chemistry” at the March Meeting of the American Physical Society.

March Meeting Provides Opportunity for Focus on Petascale Science
The American Physical Society’s (APS’s) annual March meeting is the largest gathering of physicists in the world. Earlier this year, the weeklong conference brought more than 10,000 people to San Antonio to discuss the latest advances in materials and chemical physics, as well as other research areas such as computational physics. The US Department of Energy’s (DOE’s) Oak Ridge National Laboratory (ORNL) and Oak Ridge Leadership Computing Facility (OLCF) were well represented with several speakers discussing their work on Titan, a Cray XK7 capable of 27 petaflops, or 27 quadrillion calculations per second, located at the OLCF, a DOE Office of Science User Facility.

“If you’re in the field, you want to be there,” said Jack Wells, the Director of Science for the National Center for Computational Sciences.

The March meeting has a unique structure with a variety of parallel sessions featuring invited talks from high-level experts, contributed talks from any APS member, and topical focus sessions.

The program of focus sessions is set each year from proposals submitted by the APS community, and this flexibility keeps the focus on issues that are important and timely to the community at large. In 2015, for the second year in a row, Wells proposed a focus session to the program chair of the APS Division of Computational Physics titled “Petascale Science and Beyond: Applications and Opportunities in Materials Science and Chemistry.”

The following is a sampling of talks from this focus session that communicated results produced on Titan.

ORNL researcher Paul Kent gave an invited talk on the use of quantum Monte Carlo (QMC) methods to produce accurate predictions of complex cuprate materials. Members of this class of materials are known to be high-temperature superconductors, so accurate predictions of their behavior are incredibly important. Cuprates are made of copper, oxygen, and one other transition metal, and the transition metal is known to cause problems for accurate simulations. Traditional state-of-the-art methods like density functional theory (DFT) run into difficulties simulating the strongly correlated electrons in these materials. There are some empirical ways to fix the problem, but doing so negates the possibility of creating ab initio simulations—calculations that rely only on the most fundamental theory.

QMC is an efficient approach to solving these problems because it uses statistical sampling to approximate the solution of the relevant equations. Wells compares the QMC technique to predicting the results of an election by taking a poll: the more complete and representative your sample is, the more accurate your prediction will be. Titan’s capabilities allowed Kent’s team to perform the first truly ab initio simulation of magnetic couplings. This research opens the door for future researchers to improve the method’s accuracy, apply it to other materials, and scale it to next-generation machines like the OLCF’s highly anticipated Summit supercomputer, coming to ORNL in 2018.
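To make the polling analogy concrete, here is a minimal Python sketch of the statistical sampling idea underlying Monte Carlo methods; it is an illustration only, not the QMC code used in Kent’s study. The toy observable has an exact value of 1.0, and the estimate tightens as the sample grows, just as a larger poll predicts an election more reliably.

```python
import random

def estimate_observable(n_samples):
    """Estimate <x^2> for a standard normal 'population' by sampling.

    The exact answer is 1.0; the statistical error shrinks roughly as
    1/sqrt(n_samples). Because samples are independent, they can be
    generated in parallel, which is why Monte Carlo methods map so
    naturally onto machines like Titan.
    """
    total = 0.0
    for _ in range(n_samples):
        x = random.gauss(0.0, 1.0)  # draw one "voter" from the population
        total += x * x              # accumulate the observable
    return total / n_samples

for n in (100, 10_000, 1_000_000):
    est = estimate_observable(n)
    print(f"n={n:>9}: estimate={est:.4f}, error={abs(est - 1.0):.4f}")
```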

Peter Staar, currently a researcher with the IBM Zürich Research Laboratory, demonstrated his team’s simulations of high-temperature (high-Tc) superconductors on Titan. The team ran the simulations using DCA+, an algorithm for which Staar was named a Gordon Bell Prize finalist in 2013. DCA+ scaled to the full 18,688-node Titan system and took full advantage of the system’s NVIDIA GPUs, reaching 15.4 petaflops. These improvements reduced the impact of two major problems that plague dynamical cluster QMC simulations: the fermionic-sign problem and cluster shape dependency.

Because electrons are fermions, the resulting sign problem limits the cluster size and the lowest temperature that can be simulated. DCA+ arrives at a solution nearly 2 billion times faster than its predecessor, DCA++, which allows room for more atoms at lower temperatures. Although this doesn’t solve the fermionic-sign problem in and of itself, it does enable more accurate and useful simulations in which the notorious sign problem is removed as a practical obstacle.
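To see why the sign problem is so costly, consider a schematic Python sketch (a toy model, not the DCA+ algorithm): when Monte Carlo weights can be negative, an observable is estimated as a ratio of two signed averages, and as the average sign approaches zero the estimate degrades even though the exact answer never changes.

```python
import random

def signed_average(n_samples, p_negative):
    """Estimate <O*s>/<s> for a toy observable O with true mean 1.0."""
    num = den = 0.0
    for _ in range(n_samples):
        o = random.gauss(1.0, 1.0)                        # toy observable
        s = -1.0 if random.random() < p_negative else 1.0  # random sign
        num += o * s                                       # signed observable
        den += s                                           # sign alone
    return num / den if den else float("nan")

# As p_negative -> 0.5 the average sign -> 0, and the estimate grows
# noisy even though the exact answer is always 1.0.
for p in (0.0, 0.3, 0.45, 0.49):
    print(f"average sign={1 - 2 * p:.2f}: "
          f"estimate={signed_average(100_000, p):.3f}")
```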

The second challenge is cluster shape dependency. Previously, the computed superconducting transition temperature varied significantly with the shape of the simulated cluster; with DCA+, researchers obtain consistent results across different cluster shapes.

These algorithmic improvements allowed Staar and his team to show, for the first time, converged solutions for the single-band Hubbard model that exhibit a superconducting phase transition with electron pairings (i.e., Cooper pairs) possessing d-wave angular momentum symmetry—consistent with known experimental results.

Martin Berzins, a professor of computer science at the University of Utah, gave an invited talk in which he demonstrated Uintah, a software framework he developed to study combustion and its interactions with solids.

What makes Uintah unique is its ability to take advantage of a massively parallel system like Titan through automatic task-based parallelization. It does this by first defining the tasks to be performed and then analyzing their dependencies: if two tasks are independent of each other, they can run at the same time, making the computation much faster. Berzins’ presentation used examples from large-scale combustion problems, industrial detonations, and multiphysics models, demonstrating the framework’s versatility before discussing the challenges of scaling such calculations to the next generation of supercomputers.
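The following is a minimal Python sketch of that idea, with a hypothetical task graph (the task names are illustrative, not Uintah’s actual tasks): each task declares its dependencies, and every “wave” of tasks whose dependencies are already satisfied executes concurrently.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical task graph (assumed to be acyclic): each task names
# the tasks it depends on.
tasks = {
    "read_mesh":     [],
    "init_fluid":    ["read_mesh"],
    "init_solid":    ["read_mesh"],               # independent of init_fluid
    "advance_fluid": ["init_fluid"],
    "advance_solid": ["init_solid"],
    "couple":        ["advance_fluid", "advance_solid"],
}

def run(name):
    print(f"running {name}")

done = set()
with ThreadPoolExecutor() as pool:
    while len(done) < len(tasks):
        # Every task whose dependencies are all satisfied is ready now...
        ready = [t for t, deps in tasks.items()
                 if t not in done and all(d in done for d in deps)]
        # ...and the whole ready "wave" executes concurrently.
        list(pool.map(run, ready))
        done.update(ready)
```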

Xiaolin Cheng, a computational researcher with the Computer Science and Mathematics Division at ORNL, gave a contributed talk about his work on biological membranes and the role that petascale computing can play in advancing chemistry and biology.

Lipid bilayers are fundamental to much of biology, and understanding their lateral structure and organization has become a hot topic in the community. However, not much is known about lipid bilayers because they’re hard to image. Traditional x-rays don’t work well because lipid bilayers are composed mainly of the light elements (hydrogen, carbon, and oxygen) that make up hydrocarbons, fats, and water. Neutrons, however, are sensitive to light nuclei, hydrogen in particular, and do an excellent job of overcoming these issues.

Modeling these membranes requires running very large molecular dynamics (MD) simulations, which replicate the physical movements of ensembles of atoms and molecules, and comparing the results to experimental data. Cheng spoke about his work developing MD techniques that take advantage of massively parallel supercomputers like Titan and discussed several examples of large-scale MD simulations.
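As a rough illustration of what an MD step does (a minimal sketch, not the production codes Cheng described), the following Python snippet integrates one particle in a harmonic well with the velocity-Verlet scheme, a standard MD integrator; production membrane simulations apply the same kind of update to millions of interacting atoms.

```python
import math

def force(x, k=1.0):
    """Harmonic restoring force F = -k*x (stand-in for real interatomic forces)."""
    return -k * x

def velocity_verlet(x, v, dt, n_steps, m=1.0):
    """Advance one particle with the standard velocity-Verlet integrator."""
    f = force(x)
    for _ in range(n_steps):
        x += v * dt + 0.5 * (f / m) * dt * dt  # position update
        f_new = force(x)
        v += 0.5 * ((f + f_new) / m) * dt      # velocity update
        f = f_new
    return x, v

# With k = m = 1 the oscillation period is 2*pi, so after one period
# the trajectory should return near its starting point (1.0, 0.0).
x, v = velocity_verlet(1.0, 0.0, dt=0.01, n_steps=int(2 * math.pi / 0.01))
print(f"x={x:.3f}, v={v:.3f}")
```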

—Christie Thiessen

Oak Ridge National Laboratory is supported by the US Department of Energy’s Office of Science. The single largest supporter of basic research in the physical sciences in the United States, the Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.