In 2012, a small Berkeley, California, startup called KatRisk set out to improve the quality of worldwide flood risk maps. With backgrounds in engineering, hydrology, and risk modeling, the company’s three founders knew that many factors, including annual climate and local infrastructure, affect flood risk.
The team wanted to create large-scale, high-resolution maps to help insurance companies evaluate flood risk on the scale of city blocks and buildings, something that had never been done. But they knew they would need a lot of computing power to reach that level of detail.

A KatRisk model shows widespread inland flooding in Texas following Hurricane Harvey in 2017. Image credit: KatRisk, LLC
That’s when CEO Dag Lohmann sought computing time on the nation’s most powerful supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), a US Department of Energy (DOE) Office of Science User Facility at DOE’s Oak Ridge National Laboratory. Through the OLCF’s industrial partnership program, known as Accelerating Competitiveness through Computational Excellence, KatRisk received 5 million processor hours on Titan.
“KatRisk used supercomputing to determine the flood risk for every single building in the United States, essential information used by insurers to price insurance and manage risk,” Lohmann said.
The company leveraged Titan’s GPUs to develop flood risk maps at 10 meter by 10 meter resolution for the United States and at 90 meter by 90 meter resolution, or finer, worldwide. The KatRisk team focused on combining hydrology models, which describe how much water will flow, with computationally intensive hydraulic models, which calculate water depth. In this way, the company could predict not only the probability of a flood in a given area but also how severe it might be, an important analysis for the insurance industry.
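KatRisk’s production code runs full two-dimensional hydraulic simulations on GPUs, which is what made Titan’s accelerators so valuable. The sketch below is only a simplified illustration of the hand-off between the two model types: it assumes a wide rectangular channel, a typical Manning roughness coefficient, and invented discharge values standing in for hydrology-model output, and converts each flow estimate into a rough water-depth estimate with Manning’s equation.

```python
# Illustrative only: convert a hydrology-model discharge estimate into a
# rough water-depth estimate with Manning's equation for a wide
# rectangular channel (depth ~ hydraulic radius). All values are made up;
# KatRisk's actual solver performs full 2D hydraulic simulation on GPUs.

def normal_depth(discharge_m3s, width_m, slope, manning_n=0.035):
    """Solve Manning's equation Q = (1/n) * w * h^(5/3) * sqrt(S) for depth h."""
    return (discharge_m3s * manning_n / (width_m * slope ** 0.5)) ** 0.6

# Hypothetical 1-in-100-year discharges (m^3/s) for three river reaches.
reaches = {
    "reach_a": {"q": 850.0, "width_m": 120.0, "slope": 0.0008},
    "reach_b": {"q": 95.0,  "width_m": 30.0,  "slope": 0.0020},
    "reach_c": {"q": 12.0,  "width_m": 8.0,   "slope": 0.0050},
}

for name, r in reaches.items():
    depth = normal_depth(r["q"], r["width_m"], r["slope"])
    print(f"{name}: ~{depth:.2f} m deep at the 1-in-100-year flow")
```

A real hydraulic model instead routes water across high-resolution terrain rather than assuming a simple channel shape, which is why this stage dominates the compute cycle.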
“Titan helped establish us as one of the leading catastrophe risk modeling companies in the country,” Lohmann said. “These simulations included hydraulic modeling, which is the most time-consuming part of the compute cycle.”
Now, KatRisk is providing high-resolution maps and risk modeling software to the Federal Emergency Management Agency (FEMA), which also administers the National Flood Insurance Program (NFIP). FEMA will use these tools to better evaluate the risk of inland and storm surge flooding in the United States.
As a disaster response agency, FEMA wants to predict as accurately as possible which areas are most likely to fall victim to low-probability, yet catastrophic, events that cause millions of dollars in damages every year. In fact, about 20 percent of all US flood claims come from properties outside FEMA-designated flood zones.
In addition, when flood zones are not adequately mapped, the aftermath can be extraordinarily costly. This is especially true for property owners whose homes or businesses are devastated in a storm but who did not have flood insurance because their city or neighborhood was not in a designated flood zone. Houston’s flooding during Hurricane Harvey in 2017 is a notable example of how major floods can strike outside FEMA flood zones, damaging homes that lacked flood insurance.
Finally, more accurate flood maps will reinforce NFIP’s risk management by better aligning the program’s insurance premiums with expected floods and insurance claims.
After developing its maps on Titan, KatRisk used an in-house GPU cluster and cloud computing to build its software suite for predicting and analyzing flood risk, including the tool SpatialKat, which models 50,000 years of climate-conditioned event scenarios, another industry first for catastrophe modeling.
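SpatialKat’s internals are proprietary, but the general shape of a year-based stochastic event set can be sketched generically. The example below is only a stand-in: it assumes Poisson-distributed event counts and lognormal per-event losses with invented parameters, simulates 50,000 synthetic years, and summarizes them as an exceedance curve and an average annual loss, the quantities insurers use to align premiums with expected claims.

```python
# Generic catastrophe-model sketch, not SpatialKat itself: simulate 50,000
# synthetic years of flood losses and summarize them the way insurers do.
# Event frequencies and severities below are invented for illustration.
import numpy as np

rng = np.random.default_rng(seed=42)
n_years = 50_000

# Number of damaging flood events per year (assumed Poisson, mean 1.2),
# and loss per event in millions of dollars (assumed lognormal).
events_per_year = rng.poisson(lam=1.2, size=n_years)
annual_loss = np.array([
    rng.lognormal(mean=2.0, sigma=1.0, size=k).sum() for k in events_per_year
])

# Average annual loss (AAL) underpins a risk-based premium; the exceedance
# curve shows how bad the rare years get.
aal = annual_loss.mean()
losses_sorted = np.sort(annual_loss)[::-1]
for return_period in (100, 200, 500):
    idx = n_years // return_period - 1
    print(f"1-in-{return_period}-year annual loss: ~${losses_sorted[idx]:.1f}M")
print(f"Average annual loss: ~${aal:.1f}M")
```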
“With some knowledge about the climate system and initial conditions, we can make probabilistic statements about the chance for flood annually,” Lohmann said. “Going into 2018 we know how much it has rained, or how much snowpack or drought a given area has had. So maybe normally an area has a 1-in-200 chance of flooding, but this year, it’s a 1-in-100 chance.”
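The arithmetic behind that kind of statement can be illustrated with a toy conditional-probability calculation. The sketch below is not KatRisk’s method; it simply assumes a Gumbel distribution for annual peak flow and an invented shift in its location parameter for a wet year, and shows how conditioning on antecedent conditions can roughly halve a 1-in-200 return period to about 1-in-100.

```python
# Toy illustration of climate-conditioned flood probability; the
# distributions and parameter shifts below are invented, not KatRisk's.
from scipy.stats import gumbel_r

scale = 30.0            # spread of annual peak flow (m^3/s), assumed
loc_typical = 100.0     # typical-year location parameter, assumed
loc_wet = 121.0         # wetter-than-normal antecedent conditions, assumed

# Flow that is exceeded once in 200 years under typical conditions.
flood_threshold = gumbel_r.isf(1 / 200, loc=loc_typical, scale=scale)

# Probability of exceeding that same threshold in a wet year.
p_wet = gumbel_r.sf(flood_threshold, loc=loc_wet, scale=scale)

print(f"Flood threshold: {flood_threshold:.0f} m^3/s")
print("Typical year: 1-in-200 chance of exceedance")
print(f"Wet year:     1-in-{1 / p_wet:.0f} chance of exceedance")
```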
The role of high-performance computing (HPC) in KatRisk flood modeling and the recent FEMA project are the subjects of a talk Lohmann will give on April 17 at the 2018 HPC User Forum titled “Using ORNL’s Titan to develop 50K years of flood risk scenarios for the National Flood Insurance Program.”
“Titan is a resource enabled by the taxpayer and now, with some added value, we are able to provide new resources back to the taxpayer through FEMA,” Lohmann said.
ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit https://science.energy.gov/.