Earthquakes can cause extensive damage and loss of life. And although earthquake prediction is an imperfect and developing science, a failure to warn led to manslaughter convictions for six Italian scientists after the 2009 L’Aquila earthquake. The convictions were later overturned, but the case underscored the need for improved predictive calculations and more realistic simulations.
The Southern California Earthquake Center (SCEC), using the power of the petascale Blue Waters Supercomputer at the National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign, has developed a physics-based model called CyberShake that simulates how an earthquake works rather than approximating the tremors from observations; in other words, it relies on data rather than assumptions. Recent advances in supercomputing have made this shift possible. Physics-based 3-D simulations are data driven and not limited by the design of the model: more data, less uncertainty. Since 2008, the simulations have run about 200 times faster and have produced 600 times more data, and new workflow programs allow projects to be completed in weeks rather than months.
The CyberShake model could become the foundation of public seismic hazard estimates, according to Philip Maechling, Information Technology Architect at SCEC.