‘Let’s Talk Exascale’: How Supercomputing Is Shaking Up Earthquake Science


Supercomputing is bringing seismic change to earthquake science. A field that has historically relied on looking back at past earthquakes to predict future ones is now moving forward with HPC and physics-based models that simulate the earthquake process comprehensively, end to end.

In this episode of the “Let’s Talk Exascale” podcast series from the U.S. Department of Energy’s Exascale Computing Project (ECP), David McCallen, leader of ECP’s Earthquake Sim (EQSIM) subproject, discusses his team’s work to improve the design of more quake-resilient buildings and bridges. McCallen, professor and director of the Center for Civil Engineering Earthquake Research at the University of Nevada, Reno, and a senior scientist at Lawrence Berkeley National Laboratory, examines the critical role of supercomputing – and the anticipated benefits of exascale supercomputing – in advancing the science.

“Earthquakes are a tremendous worldwide hazard,” McCallen says, “with thousands of people killed in an average year. There are certainly hotspots in the US. We’ve been lucky. We’ve had a relatively quiescent period, but there will be large future earthquakes in the San Francisco Bay area, in Washington state, and the Cascadia subduction zone in the future. And, really, the types of codes that we’re developing will ultimately—once we achieve the simulation performance we need to achieve—help inform how to better design and better account for all this complexity in earthquake science and earthquake engineering….”