Video: Supercomputing Dynamic Earthquake Ruptures

Researchers are using XSEDE supercomputers to model multi-fault earthquakes in the Brawley fault zone, which links the San Andreas and Imperial faults in Southern California. Their work could help predict the behavior of earthquakes that could affect millions of people’s lives and property. “Basically, we generate a virtual world where we create different types of earthquakes. That helps us understand how earthquakes in the real world are happening.”

Gordon Bell Prize Highlights the Impact of AI

In this special guest feature from Scientific Computing World, Robert Roe reports on the Gordon Bell Prize finalists for 2018. “The finalists’ research ranges from AI to mixed precision workloads, with some taking advantage of the Tensor Cores available in the latest generation of Nvidia GPUs. This highlights the impact of AI and GPU technologies, which are opening up not only new applications to HPC users but also the opportunity to accelerate mixed precision workloads on large scale HPC systems.”
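To illustrate the mixed-precision idea mentioned above, here is a minimal, hypothetical sketch of iterative refinement: the expensive solve is done in a lower precision (standing in for FP16/Tensor Core arithmetic), while residuals and corrections are accumulated in double precision. The matrix, sizes, and iteration count are made up for illustration and are not from any of the finalists' codes.

```python
import numpy as np

# Hypothetical mixed-precision iterative refinement sketch.
# Low precision (float32 here) does the heavy solve; float64 keeps the accuracy.
rng = np.random.default_rng(0)
n = 512
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned test matrix (made up)
b = rng.standard_normal(n)

A_lo = A.astype(np.float32)
x = np.linalg.solve(A_lo, b.astype(np.float32)).astype(np.float64)  # low-precision solve

for _ in range(3):
    r = b - A @ x                                                    # residual in float64
    dx = np.linalg.solve(A_lo, r.astype(np.float32)).astype(np.float64)
    x += dx                                                          # correction in float64

print(np.linalg.norm(b - A @ x) / np.linalg.norm(b))                 # relative residual
```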

How HPC can Benefit Society

Sharan Kalwani writes that one of the main reasons he got into computing long ago was the potential he saw in using powerful new supercomputing tools to address the needs of the society we live in. “The point here is that technically the challenges are very tractable, however society needs to also grow at a comparable, if not higher pace than the technology (which is evolving faster than us).”

Video: Kathy Yelick from LBNL Testifies at House Hearing on Big Data Challenges and Advanced Computing

In this video, Kathy Yelick from LBNL describes why the US needs to accelerate its efforts to stay ahead in AI and Big Data Analytics. “Data-driven scientific discovery is poised to deliver breakthroughs across many disciplines, and the U.S. Department of Energy, through its national laboratories, is well positioned to play a leadership role in this revolution. Driven by DOE innovations in instrumentation and computing, however, the scientific data sets being created are becoming increasingly challenging to sift through and manage.”

Unravelling Earthquake Dynamics through Extreme-Scale Multiphysics Simulations

Alice-Agnes Gabriel gave this talk at the PASC18 conference. “Earthquakes are highly non-linear multiscale problems, encapsulating geometry and rheology of faults within the Earth’s crust torn apart by propagating shear fracture and emanating seismic wave radiation. This talk will focus on using physics-based scenarios, modern numerical methods and hardware specific optimizations to shed light on the dynamics, and severity, of earthquake behavior.”

How Exascale will Move Earthquake Simulation Forward

In this video from the HPC User Forum in Tucson, David McCallen from LBNL describes how exascale computing capabilities will enhance earthquake simulation for improved structural safety. “With the major advances occurring in high performance computing, the ability to accurately simulate the complex processes associated with major earthquakes is becoming a reality. High performance simulations offer a transformational approach to earthquake hazard and risk assessments that can dramatically increase our understanding of earthquake processes and provide improved estimates of the ground motions that can be expected in future earthquakes.”

Video: HPC Use for Earthquake Research

Christine Goulet from the Southern California Earthquake Center gave this talk at the HPC User Forum in Tucson. “SCEC coordinates fundamental research on earthquake processes using Southern California as its principal natural laboratory. The SCEC community advances earthquake system science through synthesizing knowledge of earthquake phenomena through physics-based modeling, including system-level hazard modeling and communicating our understanding of seismic hazards to reduce earthquake risk and promote community resilience.”

PASC18 Keynote to Focus on Extreme-Scale Multi-Physics Earthquake Simulations

Today the PASC18 conference announced that Alice-Agnes Gabriel from Ludwig-Maximilian-University of Munich will deliver a keynote address on earthquake simulation. “This talk will focus on using physics-based scenarios, modern numerical methods and hardware specific optimizations to shed light on the dynamics, and severity, of earthquake behavior. It will present the largest-scale dynamic earthquake rupture simulation to date, which models the 2004 Sumatra-Andaman event – an unexpected subduction zone earthquake which generated a rupture of over 1,500 km in length within the ocean floor followed by a series of devastating tsunamis.”

Hayward Fault Earthquake Simulations Increase Fidelity of Ground Motions

Researchers at LLNL are using supercomputers to simulate the onset of earthquakes in California. “This study shows that powerful supercomputing can be used to calculate earthquake shaking on a large, regional scale with more realism than we’ve ever been able to produce before,” said Artie Rodgers, LLNL seismologist and lead author of the paper.

SDSC Earthquake Codes Used in 2017 Gordon Bell Prize Research

A Chinese team of researchers that won this year’s prestigious Gordon Bell Prize for simulating the devastating 1976 earthquake in Tangshan, China, used an open-source code developed by researchers at the San Diego Supercomputer Center (SDSC) at UC San Diego and San Diego State University (SDSU) with support from the Southern California Earthquake Center (SCEC). “We congratulate the researchers for their impressive innovations porting our earthquake software code, and in turn for advancing the overall state of seismic research that will have far-reaching benefits around the world,” said Yifeng Cui, director of SDSC’s High Performance Geocomputing Laboratory, who along with SDSU Geological Sciences Professor Kim Olsen, Professor Emeritus Steven Day and researcher Daniel Roten developed the AWP-ODC code.
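AWP-ODC and similar seismic simulation codes are built around finite-difference time stepping of the wave equation on large grids. The following is a minimal, hypothetical 1-D sketch of that idea only; the real AWP-ODC is a 3-D staggered-grid velocity-stress solver, and the grid size, wave speed, and source used here are invented for illustration.

```python
import numpy as np

# Hypothetical 1-D finite-difference wave propagation sketch (not AWP-ODC itself).
nx, nt = 1000, 2000
dx, dt = 10.0, 0.001          # grid spacing (m) and time step (s), chosen so c*dt/dx < 1
c = np.full(nx, 3000.0)       # uniform shear-wave speed (m/s), made up for this example

u_prev = np.zeros(nx)         # displacement at t - dt
u_curr = np.zeros(nx)         # displacement at t
u_curr[nx // 2] = 1.0         # impulsive source at the center of the grid

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]   # second-order spatial Laplacian
    u_next = 2.0 * u_curr - u_prev + (c * dt / dx) ** 2 * lap   # explicit leapfrog update
    u_prev, u_curr = u_curr, u_next                              # advance one time step
```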