Podcast: Simulating Galaxy Clusters with XSEDE Supercomputers

RomulusC has produced some of the highest resolution simulations ever of galaxy clusters, which can contain hundreds or even thousands of galaxies. The galaxy cluster simulations generated by supercomputers are helping scientists map the unknown universe. Credit: Butsky et al.

In this TACC podcast, researchers describe how they are using XSEDE supercomputers to run some of the highest resolution simulations ever of galaxy clusters.

An October 2019 study, published in the Monthly Notices of the Royal Astronomical Society, presented results from the RomulusC simulations. It probed the ionized gas, mainly hydrogen and helium, within and surrounding the intracluster medium, which fills the space between galaxies in a galaxy cluster.

Hot, dense gas of more than a million degrees Kelvin fills the inner cluster with roughly uniform metallicity. Cool-warm gas between ten thousand and a million degrees Kelvin lurks in patchy distributions at the outskirts, with a greater variety of metals. Looking like the tail of a jellyfish, the cool-warm gas traces the process of galaxies falling into the cluster and losing their gas. The gas gets stripped from the falling galaxy and eventually mixes with the inner region of the galaxy cluster.
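The temperature cuts described above translate into a simple analysis step once simulation output is in hand. Below is a minimal, hypothetical Python sketch (the arrays are placeholders, not RomulusC data) of splitting gas cells into hot and cool-warm phases and comparing their metallicity scatter:

```python
import numpy as np

# Hypothetical per-cell gas properties; placeholders, not RomulusC output.
temperature_K = np.array([2.3e7, 8.1e5, 4.4e4, 1.6e6, 9.0e4])  # gas temperature [K]
metallicity_Zsun = np.array([0.31, 0.12, 0.05, 0.28, 0.02])    # metallicity [Z_sun]

# Phase cuts quoted in the article: hot above ~1e6 K,
# cool-warm between 1e4 K and 1e6 K.
hot = temperature_K > 1.0e6
cool_warm = (temperature_K > 1.0e4) & (temperature_K <= 1.0e6)

print("fraction of cells that are hot:      ", hot.mean())
print("fraction of cells that are cool-warm:", cool_warm.mean())
print("metallicity scatter, hot gas:        ", metallicity_Zsun[hot].std())
print("metallicity scatter, cool-warm gas:  ", metallicity_Zsun[cool_warm].std())
```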

“We find that there’s a substantial amount of this cool-warm gas in galaxy clusters,” said study co-author Iryna Butsky, a PhD student in the Department of Astronomy at the University of Washington. “We see that this cool-warm gas traces extremely different and complementary structures compared to the hot gas. And we also predict that this cool-warm component can be observed now with existing instruments like the Hubble Space Telescope’s Cosmic Origins Spectrograph.”

Scientists are just beginning to probe the intracluster medium, which is so diffuse that its emissions are invisible to any current telescopes. Scientists are using RomulusC to help see clusters indirectly using the ultraviolet (UV) light from quasars, which act like a beacon shining through the gas. The gas absorbs UV light, and the resulting spectrum yields density, temperature, and metallicity profiles when analyzed with instruments like the Cosmic Origins Spectrograph aboard the Hubble Space Telescope (HST).
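The quantity an absorption spectrum most directly encodes is a column density, the ion number density integrated along the quasar’s sightline through the gas. The short sketch below illustrates that integration with made-up numbers; the path length, ion choice, and density profile are all illustrative assumptions:

```python
import numpy as np

# Toy sightline through a simulated volume: integrate ion number density along
# the path to get a column density, N = integral of n dl. All values are placeholders.
n_segments = 1000
path_length_cm = 3.086e24                    # ~1 Mpc path expressed in cm
dl = np.full(n_segments, path_length_cm / n_segments)

# Hypothetical ion number density (e.g. O VI) along the sightline [cm^-3]
n_ion = 1e-8 * np.exp(-np.linspace(-3.0, 3.0, n_segments) ** 2)

column_density = np.sum(n_ion * dl)          # [cm^-2]
print(f"column density ~ {column_density:.2e} cm^-2")
```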

“One really cool thing about simulations is that we know what’s going on everywhere inside the simulated box,” Butsky said. “We can make some synthetic observations and compare them to what we actually see in absorption spectra and then connect the dots and match the spectra that’s observed and try to understand what’s really going on in this simulated box.”

Left: Iryna Butsky, PhD Student in the Department of Astronomy, University of Washington; Right: Tom Quinn, Professor of Astronomy, University of Washington.

They applied a software tool called Trident developed by Cameron Hummels of Caltech and colleagues that takes the synthetic absorption line spectra and adds a bit of noise and instrument quirks known about the HST.

“The end result is a very realistic looking spectrum that we can directly compare to existing observations,” Butsky said. “But what we can’t do with observations is reconstruct three-dimensional information from a one-dimensional spectrum. That’s what’s bridging the gap between observations and simulations.”
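Trident is an open-source Python package built on yt, and its documented workflow matches the pattern described here: cast a ray through a snapshot, synthesize absorption-line spectra for chosen ions, then convolve with the instrument’s line spread function and add noise. The sketch below follows that general pattern; the snapshot path, ion list, and signal-to-noise level are illustrative assumptions rather than the settings used in the study:

```python
import yt
import trident

# Load a yt-readable snapshot; the filename is a placeholder, not a RomulusC file.
ds = yt.load("simulation_snapshot_filename")

# Cast a "quasar sightline" across the box and store the absorbing gas along it.
ray = trident.make_simple_ray(ds,
                              start_position=ds.domain_left_edge,
                              end_position=ds.domain_right_edge,
                              data_filename="ray.h5",
                              lines=["H I", "O VI"])   # illustrative ion choices

# Build a synthetic spectrum with Trident's HST/COS G130M instrument preset,
# then degrade it the way a real observation is degraded.
sg = trident.SpectrumGenerator("COS-G130M")
sg.make_spectrum(ray, lines=["H I", "O VI"])
sg.apply_lsf()                # convolve with the COS line spread function
sg.add_gaussian_noise(30)     # add noise at a signal-to-noise ratio of ~30 (assumed)
sg.save_spectrum("synthetic_cos_spectrum.txt")
sg.plot_spectrum("synthetic_cos_spectrum.png")
```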

One key assumption behind the RomulusC simulations, supported by the latest science, is that the gas making up the intracluster medium originates at least partly in the galaxies themselves. “We have to model how that gas gets out of the galaxies, which is happening through supernovae going off, supernovae coming from young stars,” said study co-author Tom Quinn, a professor of astronomy at the University of Washington. That means a dynamic range of more than a billion to contend with.

What’s more, clusters don’t form in isolation, so their environment has to be accounted for.

Then there’s a computational challenge that’s particular to clusters. “Most of the computational action is happening in the very center of the cluster. Even though we’re simulating a much larger volume, most of the computation is happening at a particular spot. There’s a challenge of, as you’re trying to simulate this on a large supercomputer with tens of thousands of cores, how do you distribute that computation across those cores?” Quinn said.
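A toy way to see the difficulty: if the cost per particle climbs steeply toward the cluster centre, handing each core an equal number of particles leaves most cores idle while a few do nearly all the work. The hypothetical Python sketch below (not the load balancer the production code actually uses) oversubscribes the work into small chunks and greedily assigns each chunk to the least-loaded rank, one simple way to spread a centrally concentrated workload:

```python
import heapq
import numpy as np

rng = np.random.default_rng(0)
radius = rng.uniform(0.01, 1.0, size=100_000)      # distance from cluster centre (toy)
cost = 1.0 / radius**2                              # assumed per-particle compute cost

n_ranks = 64
# Oversubscribe: many small chunks per rank so the expensive centre can be spread out.
chunks = np.array_split(np.sort(cost)[::-1], 4 * n_ranks)
chunk_costs = sorted((c.sum() for c in chunks), reverse=True)

# Greedy assignment: always give the next-biggest chunk to the least-loaded rank.
loads = [(0.0, rank) for rank in range(n_ranks)]
heapq.heapify(loads)
for c in chunk_costs:
    load, rank = heapq.heappop(loads)
    heapq.heappush(loads, (load + c, rank))

final = sorted(load for load, _ in loads)
print(f"load imbalance (max/mean): {final[-1] / np.mean(final):.2f}")
```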

“Over the course of my career, NSF’s ability to provide high-end computing has helped the overall development of the simulation code that produced this,” said Quinn. “These parallel codes take a while to develop. And XSEDE has been supporting me throughout that development period. Access to a variety of high-end machines has helped with the development of the simulation code.”

RomulusC started out as a proof of concept with friendly user time on the Stampede2 system at the Texas Advanced Computing Center (TACC), when the Intel Xeon Phi (“Knights Landing”) processors first became available. “I got help from the TACC staff on getting the code up and running on the many-core, 68-core-per-chip machines,” Quinn said.

Quinn and colleagues eventually scaled up RomulusC to 32,000 processors and completed the simulation on the Blue Waters system of the National Center for Supercomputing Applications. Along the way, they also used the NASA Pleiades supercomputer and the XSEDE-allocated Comet system at the San Diego Supercomputer Center, an Organized Research Unit of the University of California San Diego.

“Comet fills a particular niche,” Quinn said. “It has large memory nodes available. Particular aspects of the analysis, for example identifying the galaxies, are not easily done on a distributed-memory machine. Having the large shared-memory machine available was very beneficial. In a sense, we didn’t have to completely parallelize that particular aspect of the analysis.”
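Identifying galaxies in a snapshot amounts to group finding, linking nearby particles into structures, which is far easier when every particle position sits in a single address space. The sketch below is a minimal friends-of-friends-style finder run on random placeholder positions, not the group finder actually used for RomulusC; it illustrates the kind of step a large shared-memory node makes convenient:

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components
from scipy.spatial import cKDTree

# Friends-of-friends idea: link particles closer than a linking length, then
# take connected components as groups. Assumes all positions fit in memory.
rng = np.random.default_rng(1)
positions = rng.uniform(0.0, 100.0, size=(50_000, 3))   # toy particle positions
linking_length = 0.5

tree = cKDTree(positions)
pairs = tree.query_pairs(r=linking_length, output_type="ndarray")

n = len(positions)
adjacency = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
n_groups, labels = connected_components(adjacency, directed=False)

sizes = np.bincount(labels)
print(f"{(sizes > 1).sum()} linked groups among {n_groups} connected components")
```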

The Stampede2 supercomputer at the Texas Advanced Computing Center (left) and the Comet supercomputer at the San Diego Supercomputer Center (right) are allocated resources of the Extreme Science and Engineering Discovery Environment (XSEDE) funded by the National Science Foundation (NSF). Credit: TACC, SDSC.

“Without XSEDE, we couldn’t have done this simulation,” Quinn recounted. “It’s essentially a capability simulation. We needed the capability to actually do the simulation, but also the capability of the analysis machines.”

The next generation of simulations is being run on the NSF-funded Frontera system, the fastest academic supercomputer and currently the #5 fastest system in the world, according to the November 2019 Top500 list. “Right now on Frontera, we’re doing runs at higher resolution of individual galaxies,” Quinn said.

“Since we started these simulations, we’ve been working on improving how we model the star formation. And of course we have more computational power, so just purely higher mass resolution, again, to make our simulations of individual galaxies more realistic. More and bigger clusters would be good too,” Quinn added. Alyson Brooks of Rutgers University is the principal investigator for the Frontera work.

Said Butsky: “What I think is really cool about using supercomputers to model the universe is that they play a unique role in allowing us to do experiments. In many of the other sciences, you have a lab where you can test your theories. But in astronomy, you can only come up with a pen-and-paper theory and observe the universe as it is. Without simulations, it’s very hard to run these tests, because it’s hard to reproduce some of the extreme phenomena in space, like the temporal scales and the temperatures and densities of some of these extreme objects. Simulations are extremely important in being able to make progress in theoretical work.”

The study, “Ultraviolet Signatures of the Multiphase Intracluster and Circumgalactic Media in the RomulusC Simulation,” was published in October of 2019 in the Monthly Notices of the Royal Astronomical Society. The study co-authors are Iryna S. Butsky, Thomas R. Quinn, and Jessica K. Werk of the University of Washington; Joseph N. Burchett of UC Santa Cruz, and Daisuke Nagai and Michael Tremmel of Yale University. Study funding came from the NSF and NASA.

Source: TACC

