Two studies produced by the Analytics and AI Methods at Scale (AAIMS) group, which resides within the National Center for Computational Sciences at the US Department of Energy’s (DOE’s) …
Managing a supercomputer requires operators to collect and analyze data generated by the machine itself. This data includes everything from job failure rates to power consumption figures. Researchers at the …
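The kind of analysis involved is simple to illustrate. The sketch below is hypothetical: the file name, column names, and schema are assumptions made for the example, not the facility’s actual telemetry pipeline. It loads a log of job records and computes a failure rate and a mean per-node power draw.

```python
import pandas as pd

# Hypothetical job log; "job_id", "state", and "node_power_w" are
# assumed column names for illustration, not a real OLCF schema.
jobs = pd.read_csv("job_log.csv")

# Fraction of jobs that ended in a failed state.
failure_rate = (jobs["state"] == "FAILED").mean()

# Average per-node power draw across all recorded jobs, in watts.
mean_power_w = jobs["node_power_w"].mean()

print(f"job failure rate: {failure_rate:.1%}")
print(f"mean node power:  {mean_power_w:.0f} W")
```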
To analyze and process scientific data, researchers often employ Jupyter notebooks, interactive web documents that host snippets of code written in programming languages such as Python, R, or Julia. …
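A typical notebook cell mixes a few lines of code with immediately displayed output. The Python fragment below is a generic illustration of that workflow; the data and the line fit are invented for the example and are not drawn from any particular study.

```python
# A notebook-style cell: generate data, fit a trend, report the result.
import numpy as np

# Invented sample data standing in for a measured quantity.
x = np.arange(10)
y = 2.0 * x + np.random.default_rng(0).normal(0.0, 0.5, size=10)

# Least-squares line fit; in a notebook the result displays inline.
slope, intercept = np.polyfit(x, y, deg=1)
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")
```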
Since 1987, the Association for Computing Machinery has awarded the annual Gordon Bell Prize to recognize outstanding achievements in high-performance computing (HPC). Presented each year at the International Conference for …
Analyses of lung fluid cells from COVID-19 patients, conducted on the nation’s fastest supercomputer, point to gene expression patterns that may explain the runaway symptoms produced by the body’s response …
Since bursting onto the scene in the early ’90s, high-performance computing (HPC) has become the most productive method for exploring ambitious problems in science that require substantial computational power. Yet …
Every life-form depends on access to basic nutrients for survival, even microbes that live in the soil. Though these organisms are invisible to the naked eye, their soil-scavenging activity has …
At the home of America’s most powerful supercomputer, the Oak Ridge Leadership Computing Facility (OLCF), researchers often simulate millions or billions of dynamic atoms to study complex problems in science …
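Simulations of this kind are typically molecular dynamics runs, which advance atomic positions and velocities in many small time steps. The sketch below shows one velocity-Verlet step applied to a toy harmonic system; it is a minimal illustration of the integration scheme under assumed parameters, not OLCF production code.

```python
import numpy as np

def velocity_verlet_step(pos, vel, force_fn, dt, mass=1.0):
    """Advance positions and velocities by one time step dt."""
    f = force_fn(pos)
    pos = pos + vel * dt + 0.5 * (f / mass) * dt**2   # update positions
    f_new = force_fn(pos)                             # forces at new positions
    vel = vel + 0.5 * ((f + f_new) / mass) * dt       # update velocities
    return pos, vel

# Toy force law: independent harmonic wells, F = -k x, with k assumed = 1.0.
harmonic = lambda x: -1.0 * x

rng = np.random.default_rng(0)
pos = rng.normal(size=(1000, 3))   # 1,000 "atoms" in 3D; real runs use millions
vel = np.zeros((1000, 3))
for _ in range(100):
    pos, vel = velocity_verlet_step(pos, vel, harmonic, dt=0.01)
print("mean squared displacement:", float((pos**2).mean()))
```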