“The massive data sets that will be produced by the massive radio telescopes being constructed today will push the limits of available computational technologies. New breakthroughs are required in the integrated compute and data methods, algorithms, and technologies. Titan’s massively parallel, hybrid-accelerated architecture effectively gives radio astronomers a view into their computing future.”
– OLCF Director of Science Jack Wells

OLCF Director of Science details impact, promise of HPC on radio astronomy

The OLCF’s Jack Wells recently brought radio astronomers up to speed on the latest developments and potential breakthroughs in computational astrophysics.

Wells was an invited speaker at the American Astronomical Society's (AAS's) Exascale Radio Astronomy conference, held March 30 to April 4 in Monterey, California, where he detailed recent advances in computational astrophysics on Department of Energy supercomputing systems such as Titan.

AAS is the premier professional organization for North American astronomers. It aims to increase understanding of the universe by disseminating research findings, facilitating interactions among astronomers, representing the North American community on the global stage, training and mentoring the next generation of researchers, and helping its members hone their educational and research skills.

High-performance computing (HPC) has revolutionized astrophysics over the last half decade as systems have entered the petascale. The trend will almost certainly continue into the exascale era, when supercomputing systems will deliver more than an order of magnitude more power than today's leading machines.

Radio astronomy (RA) is a data-intensive branch of astronomy that uses radio waves to study bodies in space. Because of that data intensity, supercomputers are essential not only for simulating the universe and its constituents but also for organizing, analyzing, and visualizing the resulting data, helping researchers reach breakthroughs in ever-shorter timeframes.

Wells’ talk, “What does Titan tell us about preparing for exascale supercomputers?,” had three aims: to present recent results from breakthrough astrophysics simulations made possible by modern application software and leadership-class compute and data resources; to outline prospects and opportunities for today’s petascale problems; and to highlight the computational needs of astrophysics that will require exascale compute and data capability.

Members of three separate user projects also presented on the nexus between RA and supercomputing.

Dr. Pierre Ocvirk of the University of Strasbourg in France is a collaborator on Paul Shapiro’s (University of Texas at Austin) INCITE project, “Witnessing our own cosmic dawn.” Ocvirk presented early results from Titan’s simulation of the Epoch of Reionization of the cosmos. In Big Bang cosmology, the Epoch of Reionization (EoR) is the period during which the gas in the Universe went from being almost completely neutral to being almost completely ionized. This watershed event, which occurred when the Universe was a few hundred million years old (about a twentieth of its current age) and the first radiating objects formed, is intimately linked to many fundamental questions in cosmology and in the formation and evolution of structure.

Lincoln Greenhill of Harvard University spoke about the first one hundred million years of the universe, a time before stars and a time for which there are no data to constrain theoretical models. Radio telescope array projects aim to deliver the much-needed data, but these projects, for the first time in the discipline, require supercomputing. Greenhill described how his LEDA project has used off-the-shelf hardware (GPUs and servers) with great efficiency and in a scalable configuration that would serve the larger petascale arrays needed for next-generation studies of the “Dark Age.”

Two of Greenhill’s collaborators, Harvard’s Ben Barsdell and Mike Clark of GPU maker NVIDIA, presented on the potential and pitfalls of the HPC/RA relationship.

In “Petascale Cross-Correlation: Extreme Signal Processing Meets HPC,” Barsdell detailed both the promise of and the missed opportunities in the relationship between supercomputing and radio astronomy. While existing tools and techniques aren’t always a direct fit, he concluded, it would be wise for RA to pay close attention to upcoming software and hardware advances that would be easy to integrate and that have tremendous potential to help the field tackle some of its most complex problems.
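
For readers unfamiliar with the technique, cross-correlation in radio astronomy multiplies the signal from every pair of antennas and averages the products over time to form “visibilities,” so the computational cost grows with the square of the number of antennas. The short sketch below is purely illustrative; the NumPy implementation, array shapes, and names are assumptions made here, not Barsdell’s or LEDA’s actual code.

import numpy as np

def correlate(samples):
    """Toy cross-correlation: time-averaged visibilities for every
    antenna pair.  `samples` is a complex array with shape
    (n_antennas, n_channels, n_times) of channelized voltage data."""
    n_ant, n_chan, n_time = samples.shape
    vis = np.zeros((n_ant, n_ant, n_chan), dtype=np.complex64)
    for i in range(n_ant):
        for j in range(i, n_ant):
            # Multiply one antenna's spectrum by the conjugate of the
            # other's and average over time: the O(N^2) "X" step.
            vis[i, j] = (samples[i] * np.conj(samples[j])).mean(axis=1)
    return vis

# Illustrative sizes: 64 antennas, 128 frequency channels, 1,000 time samples.
rng = np.random.default_rng(0)
data = (rng.standard_normal((64, 128, 1000))
        + 1j * rng.standard_normal((64, 128, 1000))).astype(np.complex64)
print(correlate(data).shape)  # (64, 64, 128)

Doubling the number of antennas roughly quadruples the number of pairs, which is one reason next-generation arrays are pushing this kind of signal processing toward petascale and exascale systems.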

Likewise, Clark saw plenty of symbiotic opportunity on the horizon for RA and high-performance computing. Astronomy’s processing pipeline is ideal for many-core processors, Clark said, because its key algorithms possess locality and high degrees of parallelism. In fact, he concluded, perhaps no other domain is as well suited to the exascale as astronomy.
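
To make that parallelism concrete, in the toy correlator sketched above every antenna pair and frequency channel is an independent piece of work. The variant below, again an illustration under the same assumed array layout rather than production code, expresses the whole computation as a single batched tensor contraction, the form that maps naturally onto many-core hardware such as Titan’s GPUs.

import numpy as np

def correlate_batched(samples):
    """Same visibilities as the loop version, written as one contraction
    over time: vis[i, j, c] = mean over t of s[i, c, t] * conj(s[j, c, t]).
    Every (i, j, c) output element is independent, which is the data
    parallelism a many-core processor exploits."""
    n_time = samples.shape[-1]
    return np.einsum('ict,jct->ijc', samples, np.conj(samples)) / n_time

The kind of locality Clark describes is visible here as well: each output element needs only two antennas’ data for a single frequency channel.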

“The massive data sets that will be produced by the massive radio telescopes being constructed today will push the limits of available computational technologies,” said Wells. “New breakthroughs are required in the integrated compute and data methods, algorithms, and technologies. Titan’s massively parallel, hybrid-accelerated architecture effectively gives radio astronomers a view into their computing future.”