Podcast: ExaStar Project Seeks Answers in Cosmos


This simulation image is of the isotopic composition of the ejecta of a core-collapse supernova as the shock breaks through the surface of the star after approximately a day from core bounce. Shown are helium (blue), carbon (green), and radioactive nickel (red). Credit: ExaStar project

In this podcast, Daniel Kasen from LBNL and Bronson Messer of ORNL discuss advancing cosmology through ExaStar, part of the Exascale Computing Project.

Prodigious and mysterious, the cosmos has many important secrets to reveal through its stars exploding as supernovae and its colliding neutron stars and black holes.

“In those astrophysical explosions, you reach some of the most extreme conditions in the universe, much greater than anything we can achieve here on Earth, and so you can probe physics at new regimes,” said Daniel Kasen of Lawrence Berkeley National Laboratory and principal investigator of ExaStar, a project within the US Department of Energy’s Exascale Computing Project. “You can probe matter denser than the atomic nucleus. You can probe extreme gravitational fields that produce ripples in spacetime: gravitational waves.”

Scientists believe that roiling inside these colossal blasts were the cauldrons in which the heaviest elements of the universe were fused.

“We think all elements heavier than hydrogen and helium were produced in stars and stellar explosions; however, we still don’t completely understand the physics of how and where that happened,” Kasen said.

ExaStar aims to create simulations for comparison with experiments and observations to help answer a variety of questions: Why is there more iron than gold in the universe? Why are some elements rarer than others? Why are transuranic elements so hard to find on Earth? “At the same time, we want to figure out how space and time get warped by gravitational waves, how neutrinos and other subatomic particles were produced in these explosions, and how they sort of lead us down to a chain of events that finally produced us,” said Bronson Messer of Oak Ridge National Laboratory (ORNL) and the ExaStar team.


As new experimental facilities come online and unveil more about the universe, from the microscopic to the massive scales, the bridge between those scales is computer simulation, Kasen said. Yet the computational tasks involved are far from easy.

“It’s difficult because we’re solving many different physics problems coupled together in a multiphysics simulation,” Kasen said. “For a supernova explosion, for example, we have to model the physics of gravity as a star collapses and dies, but coupled to the hydrodynamics of how gas becomes turbulent, drives shocks, and is expelled in the explosion, coupled to the nuclear reactions whereby the heavy elements were formed. Also included is radiation, such as neutrinos and photons, that propagates through and produces the signals we ultimately observe. So, that’s the grand challenge. Doing it with the requisite fidelity will require exascale computing resources.”
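One common way to couple several physics operators like this, at least conceptually, is operator splitting: each module advances the shared state in turn over a small timestep. The following toy sketch illustrates the pattern only; the module names and toy update rules are illustrative assumptions, not the actual ExaStar codes or their numerics.

```python
# Hedged sketch of first-order (Lie) operator splitting for a
# multiphysics timestep. All module names and update rules here are
# toy stand-ins, not ExaStar's actual solvers.

def advance(state, dt, modules):
    """Apply each physics operator sequentially over one timestep."""
    for module in modules:
        state = module(state, dt)
    return state

# Toy "modules" acting on a shared state dictionary:
def gravity(s, dt):  # constant downward acceleration on a velocity field
    return {**s, "v": s["v"] - 9.8 * dt}

def hydro(s, dt):    # advect position with the (updated) velocity
    return {**s, "x": s["x"] + s["v"] * dt}

def burn(s, dt):     # deplete a nuclear species at a fixed toy rate
    return {**s, "ni": s["ni"] * (1 - 0.1 * dt)}

state = {"x": 0.0, "v": 0.0, "ni": 1.0}
for _ in range(10):
    state = advance(state, 0.1, [gravity, hydro, burn])
```

Splitting is first-order accurate in the timestep; production codes often use higher-order (e.g., Strang) splitting or tighter implicit coupling, but the loop-over-operators structure is the same.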

The pre-exascale supercomputer Summit at the Oak Ridge Leadership Computing Facility, ORNL, is helping the ExaStar team advance toward the needed precision. “Summit has allowed us to take pieces of the microphysics that we include in ExaStar simulations and really improve the physical fidelity in a lot of ways,” Messer said. “The most obvious example that we have is we were able to take the nuclear networks—the set of equations that tell us how one set of elements transmutes into another set of elements—and instead of using a schematic thirteen-element network, we were able to expand to hundreds of isotopes and run that in the same amount of time on Summit that it would take the smaller network to run on Titan. This has meant that we can actually make predictions that can be matched up to telescope observations both here on Earth and in space.”
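A nuclear network is, at heart, a system of coupled ordinary differential equations for isotope abundances. As a minimal illustration, the sketch below integrates the well-known 56Ni → 56Co → 56Fe decay chain that powers supernova light curves, using a simple explicit Euler step. This is a toy example of the kind of equations a network solves, not ExaStar's solver: real networks track hundreds of isotopes and use implicit methods because the equations are stiff.

```python
import math

# Half-lives (in days) of the 56Ni -> 56Co -> 56Fe decay chain
# (approximate values from standard nuclear data tables).
T_NI, T_CO = 6.075, 77.24
LAM_NI = math.log(2) / T_NI   # decay constants (1/day)
LAM_CO = math.log(2) / T_CO

def decay_chain(n_ni0, t, dt=1e-3):
    """Integrate dN_Ni/dt = -lam_Ni*N_Ni,
              dN_Co/dt =  lam_Ni*N_Ni - lam_Co*N_Co,
              dN_Fe/dt =  lam_Co*N_Co
    with explicit Euler -- a stand-in for the implicit solvers real
    networks need, since nuclear reaction ODEs are generally stiff."""
    n_ni, n_co, n_fe = n_ni0, 0.0, 0.0
    for _ in range(int(t / dt)):
        d_ni = -LAM_NI * n_ni
        d_co = LAM_NI * n_ni - LAM_CO * n_co
        d_fe = LAM_CO * n_co
        n_ni += d_ni * dt
        n_co += d_co * dt
        n_fe += d_fe * dt
    return n_ni, n_co, n_fe

# After one 56Ni half-life, roughly half the nickel remains,
# and total abundance is conserved.
ni, co, fe = decay_chain(1.0, T_NI)
```

Scaling from this two-reaction chain to the hundreds of coupled isotopes Messer describes is what makes GPU acceleration so valuable: the work per zone grows rapidly with network size.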

(from left) Daniel Kasen from LBNL and Bronson Messer from ORNL

The key feature of Summit that enabled the big leap in physical fidelity was the computational power of the GPUs. “We harnessed the power to solve our equations very, very fast,” Messer said. “With the level of improved speed we’re going to get with the new set of GPUs on Frontier [the upcoming exascale supercomputer], we’ll be able to add more and more realism. We intend to take other pieces of the physics that we don’t currently have on Summit, put those on the GPUs on Frontier, and we hope to get the entire simulation to run a lot faster. So, instead of taking perhaps on the order of a month, which is what it takes us to run a supernova simulation or a neutron star merger simulation now, we hope to get that down to less than a week. And then we will actually be able to do science with those runs.”

Scientists have been simulating astrophysical explosions for many decades. Although they have made great progress in understanding the fundamental elements of the simulation, computational expense has been a barrier in getting a grasp on how the explosions happen and how the heavy elements are formed. The speedup and memory capabilities of exascale computing could remove that obstacle.

“I think we can look forward to resources like Frontier and be hopeful we’ll reach a tipping point in the science whereby we’ll be able to actually treat all the key physics with the requisite fidelity,” Kasen said. “We will move from understanding the basic elements of the explosion to being able to do predictive science, determining how a supernova depends on certain parameters and types of stars. We’re about to enter a future in which exascale simulations are going to be the linchpin between the microscopic- and astrophysical-scale experiments.”

Source: Scott Gibson at the Exascale Computing Project

Download the MP3