Researchers used the world’s fastest supercomputer for open science to make the world’s largest artificial intelligence model for weather prediction even larger, more detailed, and more accurate.
The second version of the Oak Ridge Base Foundation Model for Earth System Predictability, or ORBIT-2, dials in up-to-the-minute forecasts down to the doorstep level with help from Frontier, the flagship supercomputer at the Department of Energy’s Oak Ridge National Laboratory.
Training on Frontier, the 2-exaflop HPE Cray EX supercomputing system at ORNL’s Oak Ridge Leadership Computing Facility (OLCF), pushed the model past longtime computational limits to achieve unparalleled speed and accuracy in its predictions, researchers said.
“When it comes to AI capability, we’re pushing the limits of extreme scale,” said Xiao Wang, an ORNL computational research scientist and lead author of the ORBIT-2 study. “Results like these once required a large supercomputer running for days to obtain a prediction this specific. Today, we can obtain these results in milliseconds at close to 99 percent accuracy, and we can refine those results to see the microvariations across neighborhoods and even from house to house. DOE has a unique capability in Frontier that allowed this project to happen.”
The study earned the team a finalist nomination for the Association for Computing Machinery’s Gordon Bell Prize for Climate Modelling, which honors innovations in applying high-performance computing to climate modeling applications. This year’s prize will be presented at the International Conference for High Performance Computing, Networking, Storage, and Analysis, or SC25, which takes place Nov. 16-21 in St. Louis, Missouri.
The team will present results of their study, also nominated for a Best Paper Award at SC25, on Tuesday, Nov. 18.

The ORBIT-2 study builds on the team’s initial ORBIT model, a previous finalist for the same prize. The first model focused on long-range weather forecasting and generated highly accurate predictions up to 30 days in advance.
ORBIT-2 refines those forecasts to precise locations and likely consequences. The model continually evaluates predictions for accuracy and rates confidence levels for each forecast to help control for errors.
“We can use ORBIT-1 to forecast the weather up to 30 days in advance for several regions and ORBIT-2 to refine those forecasts at a particular location to support impact modeling,” said Dan Lu, an ORNL Earth system model developer and senior author of the study. “Weather is a primary driver affecting agriculture, infrastructure, the energy grid and many other critical sectors. We can use this data to project energy needs and inform preparations for first responders and other agencies. For example, we’re collaborating with the Tennessee Valley Authority to use our models to forecast water levels and inform controlled releases across their dam networks.”
Popular AI tools such as OpenAI’s ChatGPT rely on large language models, which learn to recognize and predict patterns in word fragments, or tokens, by training on large datasets, often scraped from the web. ORBIT-2 takes a similar foundation-model approach but instead learns by analyzing a broad range of weather data, including cloud formations, humidity, temperature and geography. The team leveraged Frontier’s AMD GPUs to train the massive model.
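How ORBIT-2 tokenizes its data is detailed in the team’s paper; what follows is only a minimal sketch of the general idea, in which patches of a gridded weather field play the role that word fragments play for a language model. The patch size, grid dimensions and function name here are illustrative assumptions, not the model’s actual code.

```python
# Illustrative sketch: turning a gridded weather field into patch tokens,
# analogous to word tokens in a language model. Not ORBIT-2's actual code.
import numpy as np

def patchify(field: np.ndarray, patch: int = 16) -> np.ndarray:
    """Split a 2D field (e.g., temperature on a lat-lon grid) into
    flattened patch tokens, one row per token."""
    h, w = field.shape
    rows, cols = h // patch, w // patch
    return (
        field[: rows * patch, : cols * patch]   # trim to a multiple of patch
        .reshape(rows, patch, cols, patch)      # carve the grid into blocks
        .swapaxes(1, 2)                         # gather each block together
        .reshape(rows * cols, patch * patch)    # flatten each block to a token
    )

# A 721 x 1440 grid (0.25-degree global resolution) yields 45 * 90 = 4,050
# tokens of 256 values each, for one variable at one time step.
temperature = np.random.rand(721, 1440).astype(np.float32)
print(patchify(temperature).shape)  # (4050, 256)
```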
“ORBIT-2 marks a defining moment for AI-driven science,” said Prasanna Balaprakash, ORNL’s director of AI programs. “By combining exascale computing with scalable AI foundation models, we are moving toward true Earth-system intelligence — one that learns, generalizes, and delivers insight at planetary scale. This achievement exemplifies how AI and high-performance computing can work hand in hand to accelerate discovery and deliver actionable knowledge for humanity.”
The increased range of data sources vastly increased the model’s computational demand. That level of complexity outmatches the capabilities of most supercomputers — but not Frontier and its nearly 10,000 nodes.
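For a rough sense of what spreading one training job across many GPUs involves, the generic data-parallel sketch below shows the common pattern, launched with a tool such as torchrun. This is not ORBIT-2’s training code; the team’s parallelization strategy is described in their paper, and the stand-in model and settings here are placeholders.

```python
# Generic multi-GPU data-parallel training sketch (not ORBIT-2's code).
# Launch with, e.g.: torchrun --nproc-per-node=8 train_sketch.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # One process per GPU; a job scheduler spreads these across nodes.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(256, 256).cuda(local_rank)  # stand-in model
    model = DDP(model, device_ids=[local_rank])
    opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

    x = torch.randn(32, 256, device=local_rank)         # stand-in batch
    loss = model(x).square().mean()
    loss.backward()      # DDP averages gradients across all ranks here
    opt.step()
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```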
“We’ve expanded our model to incorporate not just the previous parameters but more than 4.4 billion token sequences,” Wang said. “Think of tokens as the data equivalent of individual letters and punctuation, and parameters as words and phrases. Token sequences are increasingly larger and more complex data combinations — from sentences to paragraphs and combinations of paragraphs.
“Unlike text, which is one-dimensional, our weather data is four-dimensional. That makes every equation four times, 16 times, 64 times longer and more as we go, which makes training our model significantly more computationally demanding than most state-of-the-art weather models.”
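Wang’s multiplication can be made concrete with rough numbers; the grid sizes below are illustrative assumptions, not ORBIT-2’s actual configuration.

```python
# Back-of-the-envelope: token counts multiply across data dimensions.
tokens_1d = 1_024                  # a text sequence has one axis
lat, lon = 64, 128                 # two spatial axes
levels = 16                        # a vertical axis
steps = 8                          # a time axis
tokens_4d = lat * lon * levels * steps
print(tokens_1d, tokens_4d)        # 1024 vs. 1048576 tokens per sample
# Self-attention cost grows roughly with the square of sequence length,
# so each added axis compounds the training cost dramatically.
```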
The team plans to publicly release ORBIT-2’s source code on sites such as Hugging Face and GitHub. The datasets used for training ORBIT-2 have been released on the OLCF’s Constellation repository. Constellation enables OLCF users to publish results of their work carried out on OLCF supercomputers such as Frontier and assists projects in meeting federal requirements for research reproducibility and data availability.
“What we’ve achieved with this model could be translated to a wide range of applications in a variety of fields — from hydrology to energy and many others,” Wang said. “We want to see as many people as possible benefit from these capabilities.”
Besides Balaprakash, Wang, and Lu, the ORBIT-2 team included Jong-Youl Choi, Takuya Kurihana, Isaac Lyngaas, Hong-Jun Yoon, David Pugmire, Ming Fan, Nasik Nafi, Aristeidis Tsaris, Maliha Hossain, Dali Wang, Peter Thornton, and Moetasim Ashfaq of ORNL; Ashwin M. Aji of AMD; Mohamed Wahib of the RIKEN Center for Computational Science in Kobe, Japan; and Xi Xiao of the University of Alabama at Birmingham.
This research was supported by the ORNL AI Initiative and the DOE Office of Science’s Advanced Scientific Computing Research program. The OLCF is a DOE Office of Science user facility at ORNL.
UT-Battelle manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.



