
ORNL computing experts refine their approach to decommissioning and disposing of large computing systems safely and securely, improving efficiency and lowering costs

Ever wonder what happens to massive supercomputing systems when they’re retired? Surprisingly, when it comes to the data, it’s not too different from disposing of old documents: the drives go straight into a shredder and are then sent to recycling.

At the end of 2023, the Summit supercomputer — formerly the world’s most powerful, located at the Department of Energy’s Oak Ridge National Laboratory — was scheduled to be decommissioned and dismantled in preparation for building the lab’s next world-leading supercomputer. But because of the machine’s prolific productivity, the decision was made to continue operating Summit through 2024.

However, the additional year of allocations through the SummitPLUS program required replacing the old, failing high-performance storage system, Alpine, with Alpine2. Crews began dismantling the Alpine storage system over the summer.

“Summit was designed to run huge simulations on supernovae and fusion reactors,” said Paul Abston, group leader for infrastructure operations at ORNL’s National Center for Computational Sciences. “You’d be hard pressed to find a place that has more hard drives than us, maybe besides Amazon, Google or Microsoft. So, taking Alpine apart is a big job, and, of course, safety and security are number one.”

Launched in 2018, the Summit supercomputer is currently ranked No. 9 on the TOP500 list of the world’s most powerful supercomputers. Alpine, an IBM Spectrum Scale parallel file system — managed by the Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility — is used to temporarily store data from Summit and other support systems, including Andes, a computing cluster for pre- and post-processing of Summit simulation data.

The Alpine system is composed of 40 cabinets that occupy approximately 1,400 square feet of floor space. Alpine’s 250 petabytes of disk space are provided by 32,494 hard drives, each approximately 6 inches long by 4 inches wide and weighing a little over a pound.

“Each one of those 32,000-plus drives must be physically removed one at a time by hand. That’s about 20 tons of hardware that we have to process,” said Abston.

To ensure any remaining data on the hard drives is protected, the drives are placed in a locked bin as soon as they are removed from the cabinets, then taken to a secure location to be physically destroyed. That’s where the shredder comes in.

‘It’s a lot like a woodchipper’

The shredder, supplied and operated by ShredPro Secure, a small business in East Tennessee, is a mobile unit about 4 feet wide that stands waist-high. A technician feeds the hard drives into an opening at the top of the machine, where counter-rotating metal teeth tear the drives apart and reduce them to small, irregular strips a few inches in size. The mobile shredder can shred one hard drive every 10 seconds, with a theoretical capacity of up to 3,500 hard drives a day.
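For a sense of scale, a quick back-of-the-envelope check shows what those numbers imply. This is a minimal Python sketch using only the figures quoted in this article; the constants come from the text, not from any ORNL system or tool.

```python
# Back-of-the-envelope check of the shredding figures quoted in this article.

DRIVES = 32_494               # hard drives in the Alpine system
SECONDS_PER_DRIVE = 10        # shredder throughput: one drive every 10 seconds
THEORETICAL_PER_DAY = 3_500   # stated theoretical daily capacity
CAPACITY_PB = 250             # Alpine's total disk space, in petabytes

# Continuous shredding time for all of Alpine's drives
hours = DRIVES * SECONDS_PER_DRIVE / 3600
print(f"Continuous shredding time: {hours:.0f} hours")  # ~90 hours

# Working days needed at the stated theoretical capacity
print(f"Days at 3,500 drives/day: {DRIVES / THEORETICAL_PER_DAY:.1f}")  # ~9.3 days

# Average capacity per drive (250 PB = 250,000 TB)
print(f"Average capacity per drive: {CAPACITY_PB * 1000 / DRIVES:.1f} TB")  # ~7.7 TB
```

In other words, shredding Alpine’s drives alone amounts to roughly 90 hours of continuous machine time, or about a week and a half of work at the shredder’s theoretical daily capacity.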

“It’s a lot like a woodchipper. The teeth of the shredder tear the drives into tiny pieces, making it impossible to reconstruct into a functioning drive,” Abston said. “Even though we’re not dealing with classified data, the data still belongs to the users, and we have a responsibility to make sure it’s protected.”

After the drives are shredded, a conveyor belt gathers the material and deposits the waste into a bin, which is then transferred to larger containers and taken to be recycled through ORNL’s metal recycling program.

“Any metal that we recycle, the money comes directly back to the Oak Ridge National Laboratory budget. So, not only is this an environmentally friendly approach, it’s also more budget friendly,” Abston said.


ORNL staff removed and recycled more than 32,000 hard drives belonging to the Alpine high-performance storage system in preparation for building the lab’s next world-leading supercomputer. Credit: Angela Gosnell/ORNL, U.S. Dept. of Energy

Passing the savings on

Decommissioning major computing systems is an evolving process that Abston and his team have refined over the years.

The last time they decommissioned a system like Alpine was in 2019, with the Atlas storage system. With approximately 20,000 hard drives, Atlas was roughly two-thirds the size of Alpine. Even so, Abston recalled, doing everything in-house took the team nine months and came at a substantially higher cost.

Working with an outside vendor allowed the team to process drives from additional support systems beyond Alpine, increasing the workload by about 10,000 hard drives. As a result, they completed twice the amount of work in under two months, compared with the nine months the smaller Atlas job required, and at a significantly lower cost.

What’s more, the experience provided a business case for the lab to purchase its own shredder for use on future projects, which will allow ORNL to pass on even more savings and improve data security.

“Shredding on-site at our facility, in the long run, means we’re gonna come out with a much cheaper disposition that saves taxpayers money,” said Abston.

User data previously stored on Alpine was transferred to other OLCF storage systems. Summit will continue operating until Nov. 1, 2024. On Nov. 19, Alpine2 will be switched to read-only for Summit and will then be reconfigured into a nearline storage system supporting other OLCF data capabilities.

UT-Battelle manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit energy.gov/science.

Jeremy Rumsey

Jeremy Rumsey is a senior science writer and communications specialist at Oak Ridge National Laboratory's Oak Ridge Leadership Computing Facility. He covers a wide range of science and technology topics in the field of high-performance computing.