Podcast: Earth and Space Science for Exascale


In this podcast, Anshu Dubey of Argonne National Laboratory describes the Earth and Space Science application portfolio in the Exascale Computing Project (ECP).

The Earth and Space Science efforts look at fundamental questions, from the origin of the universe and its chemical elements to planetary processes and interactions affecting life and longevity. These are questions to which Earth and Space Science is applying its substantial computational knowledge and capabilities. They typically involve phenomena for which controlled, fine-resolution data collection is extremely difficult or infeasible; in many cases, fundamental simulations are our best source of data to confirm scientific observations.

The Earth and Space Science application project portfolio consists of ExaSky (Salman Habib, principal investigator [PI], Argonne National Laboratory)—unraveling the mysteries surrounding the structure of the universe; ExaStar (Daniel Kasen, PI, Lawrence Berkeley National Laboratory [LBNL])—pursuing answers to questions about where the heavier elements were created and the processes that generated those nuclei; EQSIM (David McCallen, PI, LBNL)—modeling earthquakes and their impact on structures within an earthquake zone; Subsurface (Carl Steefel, PI, LBNL)—probing the cracks in wellbores and reservoirs to reliably inform decisions pertaining to carbon capture, fossil fuel extraction, and waste disposal; and E3SM-MMF (Mark Taylor, PI, Sandia National Laboratories)—focusing on resolving clouds at an unprecedented scale with advanced parameterization.

Interested in all aspects of high-performance computing (HPC), Dubey finds software architecture and process design to be the most exciting. She is the chief architect of FLASH, a multiphysics HPC scientific code used by numerous science and engineering domains. In existence for 20 years, FLASH was first applied to simulating astrophysical phenomena, such as novae and supernovae, with funding from an academic alliance program within DOE’s National Nuclear Security Administration.

With a framework flexible enough to allow other scientific communities to add their own specific physics capabilities, the code was eventually adopted by a number of different scientific domains and now serves as their community code. “FLASH started as sort of an amalgamation of three completely independent pieces of code, and then it went through two major architectural revisions,” Dubey said. “What we’re doing under the Exascale Computing Project is re-architecting it yet again to make it effective on the forthcoming highly heterogeneous platforms in exascale.”

Heterogeneity, or diversity, is a factor not only in the computing platforms but also in the mathematical solvers the Earth and Space Science teams are addressing.

“As scientific understanding grows, the model to be simulated becomes higher in fidelity, and that means the solvers become more heterogeneous. Adding fidelity requires a more sophisticated numerical method or new solvers that stress the hardware differently,” Dubey said.

Each of the Earth and Space Science applications has some boutique components, capabilities built to solve one specific problem, as well as broader uses. “By and large, these applications are solving partial differential equations, and so there is that generality,” Dubey said. “Most times, the range of scales is so huge that you cannot resolve every scale, so then you have to do something called subgrid models, which can be very boutique.”
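To make the subgrid idea concrete, here is a minimal sketch, not code from FLASH or any ECP application: a 1D Burgers equation solved on a coarse grid, with a Smagorinsky-style eddy viscosity standing in for mixing by the scales the grid cannot resolve. The equation choice, the coefficient value, and the grid parameters are all illustrative assumptions.

```python
# Illustrative sketch only; not code from FLASH or any ECP application.
# 1D Burgers equation with a simple eddy-viscosity "subgrid" term that
# stands in for mixing by scales the grid cannot resolve.
import numpy as np

nx = 256
dx = 1.0 / nx
nu = 1.0e-4                       # resolved (molecular) viscosity
c_sgs = 0.2                       # assumed subgrid-model coefficient
x = np.arange(nx) * dx
u = np.sin(2.0 * np.pi * x)       # initial condition on a periodic domain

def rhs(u):
    # Rusanov (local Lax-Friedrichs) flux for the advection term u*du/dx.
    f = 0.5 * u**2
    up = np.roll(u, -1)
    a = np.maximum(np.abs(u), np.abs(up))
    fhat = 0.5 * (f + np.roll(f, -1)) - 0.5 * a * (up - u)   # flux at i+1/2
    adv = -(fhat - np.roll(fhat, 1)) / dx
    # Resolved gradient and a Smagorinsky-style eddy viscosity at the grid scale.
    dudx = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)
    nu_eff = nu + (c_sgs * dx) ** 2 * np.abs(dudx)
    diff = nu_eff * (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    return adv + diff

dt = 0.2 * dx                     # explicit step well inside the CFL limit
for _ in range(2000):
    u = u + dt * rhs(u)
```

The advection and diffusion terms are generic PDE machinery; the eddy-viscosity closure is the “boutique” part that each application tailors to its own physics.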

Dubey said that sustained, stable funding from ECP has enabled the project teams to push themselves and pursue answers to questions they hadn’t considered before. A case in point is the earthquake simulation project, EQSIM.

“In that community, it is very obvious that they hadn’t really been exposed to high-performance computing in any big way,” Dubey said. However, one of the major thrusts of EQSIM has been to resolve the impact of ground motion in the frequency range of interest for infrastructure.

“For example, they were able to do simulations of buildings at the 1- to 2-hertz level, but in order to have any realistic insight, you need to get to 9 to 10 hertz,” Dubey said. “So, once they were given this opportunity, they threw all of their efforts into making their models sophisticated enough to be able to do this, and now they are able to run simulations in a reasonable amount of time at 10 hertz. I would say that is almost revolutionizing the field in terms of its computational abilities and insights. And similar exciting things are happening in all other fields.”
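A rough scaling argument shows why pushing from 1–2 hertz to 10 hertz is so demanding: resolving a wave requires a fixed number of grid points per wavelength, so the grid spacing must shrink in proportion to the frequency, and in three dimensions plus time the cost grows roughly as the fourth power of frequency. The sketch below uses assumed values (a 500 m/s minimum shear-wave speed and 10 points per wavelength), not EQSIM’s actual discretization or numbers.

```python
# Rough, illustrative estimate; not EQSIM's actual numbers or method.
def relative_cost(f_hz, v_min=500.0, points_per_wavelength=10):
    """Grid spacing (m) and cost relative to a 2 Hz run, assuming a minimum
    shear-wave speed v_min and an explicit time step tied to the spacing."""
    wavelength = v_min / f_hz
    dx = wavelength / points_per_wavelength
    # Cells ~ (1/dx)^3 ~ f^3 and time steps ~ 1/dt ~ 1/dx ~ f, so cost ~ f^4.
    cost = (f_hz / 2.0) ** 4
    return dx, cost

for f in (2.0, 5.0, 10.0):
    dx, cost = relative_cost(f)
    print(f"{f:4.1f} Hz: dx ~ {dx:5.1f} m, cost ~ {cost:6.0f}x the 2 Hz run")
```

Under these assumptions, a 10-hertz run costs on the order of 600 times a 2-hertz run, which is the kind of jump that pushes these applications toward exascale machines.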

A majority of the science codes the teams are developing can, with some tweaking, be applied to a much wider class of problems. For example, the ExaStar project’s work on FLASH could benefit other science domains, such as biomechanics, or other complex physical processes. Similarly, the embedded boundary methods in the Subsurface project are a technology that could be applied to a wide range of uses.

“So, I think what this is doing—in addition to scientific results—is actually producing very sophisticated tools for a lot of science domains to be able to work on large scales and get more insights,” Dubey said.

Source: Scott Gibson at the Exascale Computing Project

Download the MP3
