Articles in the Technology Category
In response to a growing interest in data services that are integrated with high-performance computing, the National Center for Computational Sciences has expanded its data analysis group, the Advanced Data and Workflow group.
The OLCF’s new NVIDIA DGX-1 deep learning system is offering scientists opportunities to apply deep learning to big data analytics, with the goal of automating and accelerating the scientific discovery process.
OLCF staff are testing OpenShift as a way for users to independently deploy and manage scientific workflows on OLCF systems.
OLCF staff members recently built and ran containers, a technology that bundles an operating system and software into a single image file, making it easier for researchers to run deep learning software on OLCF supercomputers.
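As a rough illustration of the idea, a container image for an HPC system is often built from a short definition file. The sketch below uses the Singularity definition-file format common on HPC systems; the base image, package, and commands are hypothetical examples, not details from the article.

```
Bootstrap: docker
From: tensorflow/tensorflow:latest

%post
    # Commands run once at build time, inside the container;
    # the package installed here is only an illustrative example.
    pip install h5py

%runscript
    # Default command executed when a user runs the image.
    exec python "$@"
```

Because the resulting image is a single file, a researcher can build it on a workstation and copy it to a supercomputer, where the container runtime executes it without requiring the software stack to be installed on the host.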
The OLCF’s new ARM1 early development system gives researchers the opportunity to test various software packages and explore an experimental environment for ARM architecture–based systems.
While upgrading chemistry applications in the pre-Summit development environment known as Summitdev, Oak Ridge Leadership Computing Facility staff fixed an unexpected bottleneck in a key tensor algebra library, boosting performance by as much as 10 times.
In 2016, the OLCF introduced a new runtime framework that allows users of hybrid systems—such as the OLCF’s 27-petaflop Titan—to better exploit GPU-accelerated architectures.
OLCF staff created Constellation, a DOI framework that enables researchers to obtain a digital object identifier (DOI) to catalog and publish scientific data artifacts for open access.
OLCF staff members led a workshop January 10–12 to provide CAAR teams with a quickstart guide for using the Summitdev early access system, which features IBM’s POWER processors and NVIDIA’s Pascal GPU–based architecture.
Leveraging the high-performance computing and data resources provided by the OLCF and CADES, neuroscience researchers are advancing the computational modeling of individual neurons.