
Hayward Fault Earthquake Simulations Increase Fidelity of Ground Motions

Researchers at LLNL are using supercomputers to simulate the onset of earthquakes in California. “This study shows that powerful supercomputing can be used to calculate earthquake shaking on a large, regional scale with more realism than we’ve ever been able to produce before,” said Artie Rodgers, LLNL seismologist and lead author of the paper.

Podcast: Proxy Apps set the stage for Exascale Computing Project

In this Let’s Talk Exascale podcast, David Richards from LLNL discusses the Proxy Apps project, which is curating a collection of proxy apps that will represent the real applications of importance to the ECP.

Let’s Talk Exascale Podcast: ECP’s Application Assessment Project

In this Let’s Talk Exascale podcast, Kenny Roche from Pacific Northwest National Laboratory describes the ECP’s Application Assessment Project. “With Productivity, we’re aiming to coordinate how to improve software productivity of the code teams and to sustain that performance and development progress to the completion of each ECP application code project.”

Let’s Talk Exascale Podcast looks at the GAMESS Project for Computational Chemistry

In this episode of Let’s Talk Exascale, Mike Bernhardt from ECP discusses the GAMESS project with Mark Gordon from Ames Laboratory. Gordon is the principal investigator for the ECP project called General Atomic and Molecular Electronic Structure System (GAMESS), which is developing methods for computational chemistry. “Just about anything a user would want to do with computational chemistry can be done with GAMESS—everything from very high-level quantum chemistry to semi-empirical methods to classical force fields.”

Report: Future Software and Data Ecosystem for Scientific Inquiry

“The tremendous progress that we’re making toward the achievement of exascale systems, both here in the United States and in the European Union and Asia, will be undermined unless we can create a shared distributed computing platform to manage the logistics of massive, multistage data workflows with their sources at the network edge. Backhauling these rivers of data to the supercomputing center or the commercial cloud will not be a viable option for many, if not most applications.”

The U.S. DOE Exascale Computing Project – Goals and Challenges

Paul Messina from Argonne gave this Invited Talk at SC17. “Balancing evolution with innovation is challenging, especially since the ecosystem must be ready to support critical mission needs of DOE, other Federal agencies, and industry, when the first DOE exascale systems are delivered in 2021. The software ecosystem needs to evolve both to support new functionality demanded by applications and to use new hardware features efficiently. We are utilizing a co-design approach that uses over two dozen applications to guide the development of supporting software and R&D on hardware technologies as well as feedback from the latter to influence application development.”

Apply now for Argonne Training Program on Extreme-Scale Computing 2018

Computational scientists now have the opportunity to apply for the upcoming Argonne Training Program on Extreme-Scale Computing (ATPESC). The event takes place from July 29 to August 10, 2018, in greater Chicago. “With the challenges posed by the architecture and software environments of today’s most powerful supercomputers, and even greater complexity on the horizon from next-generation and exascale systems, there is a critical need for specialized, in-depth training for the computational scientists poised to facilitate breakthrough science and engineering using these amazing resources.”

Supercomputing Earthquakes in the Age of Exascale

Tomorrow’s exascale supercomputers will enable researchers to accurately simulate the ground motions of regional earthquakes quickly and in unprecedented detail. “Simulations of high frequency earthquakes are more computationally demanding and will require exascale computers,” said David McCallen, who leads the ECP-supported effort. “Ultimately, we’d like to get to a much larger domain, higher frequency resolution and speed up our simulation time.”

Exascale Computing to Accelerate Clean Fusion Energy

In this special guest feature, Jon Bashor from LBNL writes that exascale computing will accelerate the push toward clean fusion energy. “Turning this from a promising technology into a mainstream scientific tool depends critically on high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales.”

Video: The DOE Exascale Computing Project

Doug Kothe from ORNL gave this talk at the HPC User Forum in Milwaukee. “The Exascale Computing Project (ECP) was established with the goals of maximizing the benefits of high-performance computing (HPC) for the United States and accelerating the development of a capable exascale computing ecosystem.”