Interview: Why HPC is the Right Tool for Physics


Over at the SC19 Blog, Charity Plata continues the HPC is Now series of interviews with Enrico Rinaldi, a physicist and special postdoctoral fellow with the Riken BNL Research Center. This month, Rinaldi discusses why HPC is the right tool for physics and shares the best formula for garnering a Gordon Bell Award nomination.

Photo Courtesy of Brookhaven National Laboratory

Rinaldi was part of a team nominated for the 2018 ACM Gordon Bell Prize for developing an algorithm and code that can more precisely determine the lifetime of the neutron. Their work centered on quantum chromodynamics, or QCD, the fundamental particle theory that describes the inner workings of subatomic protons and neutrons. Specifically, they discretized space and time on a four-dimensional grid, a technique known as lattice QCD that, as Rinaldi explained, poses the added challenge of requiring a “top500 supercomputer” to pull off.
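The Gordon Bell-nominated code itself is not reproduced here, but to give a rough sense of what “discretizing space and time on a four-dimensional grid” means in practice, the toy sketch below puts a single real scalar field (not the quark and gluon fields of actual QCD) on a tiny 4x4x4x4 lattice and samples it with a Metropolis Monte Carlo sweep. Everything in it, including the NumPy implementation, the lattice size, the couplings, and the scalar-field simplification, is an illustrative assumption rather than the team's method.

```python
# Toy illustration of a lattice field theory: a real scalar field on a small
# 4D grid, updated with a Metropolis sweep. This is NOT the Gordon Bell team's
# lattice QCD code; lattice size, couplings, and the scalar field are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
L = 4                                  # sites per dimension (tiny, for demonstration)
shape = (L, L, L, L)                   # one axis for time, three for space
phi = rng.normal(size=shape)           # field value at every lattice site
m2, lam = 0.5, 0.1                     # toy mass^2 and quartic coupling

def local_action(field, site):
    """Part of the action that depends on this site: neighbour differences plus potential."""
    s = 0.0
    for mu in range(4):                                # loop over the four lattice directions
        fwd = list(site); fwd[mu] = (fwd[mu] + 1) % L  # periodic boundary conditions
        bwd = list(site); bwd[mu] = (bwd[mu] - 1) % L
        s += (field[tuple(fwd)] - field[site])**2 / 2
        s += (field[tuple(bwd)] - field[site])**2 / 2
    p = field[site]
    return s + 0.5 * m2 * p**2 + lam * p**4

def metropolis_sweep(field):
    """One sweep: propose a small change at each site, accept with probability exp(-dS)."""
    for site in np.ndindex(*shape):
        old = field[site]
        old_S = local_action(field, site)
        field[site] = old + rng.normal(scale=0.5)
        dS = local_action(field, site) - old_S
        if dS > 0 and rng.random() > np.exp(-dS):
            field[site] = old                          # reject: restore the old value

for sweep in range(100):
    metropolis_sweep(phi)
print("mean field after sampling:", phi.mean())
```

Production lattice QCD replaces this scalar field with SU(3) gauge links and quark fields on lattices with billions of sites, which is what pushes the calculation onto leadership-class machines.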

Charity: Right now, what does your research focus on and could it even be done without HPC?

Enrico: My research is deeply connected to HPC and relies on it. I am pursuing a research program exploring theories of dark matter, a mysterious form of mass that is not visible except through its gravitational pull on regular luminous matter (gas, stars, galaxies). In these theories, dark matter is a composite object and, as in QCD, we need numerical approaches to predict how dark matter particles move and how they might interact with us. These numerical approaches are very similar to lattice QCD. However, QCD is part of the Standard Model of particle physics and has entered an era of precision where we use HPC to reduce error bars even further. Dark matter theories, on the other hand, are extensions of the Standard Model, and HPC is necessary to even begin to address very simple questions.

Charity: Your lattice QCD work has engaged computing resources on two of the ‘biggest’ systems available today, Sierra (125 petaflops peak) at Lawrence Livermore National Laboratory and Summit (143.5 petaflops on the Linpack benchmark) at Oak Ridge National Laboratory. How did working with such leading-edge machines impact your research?

Enrico: Sierra and Summit are incredible machines, and we were lucky to be among the first teams to use them to produce new scientific results. The impact on my lattice QCD research was tremendous, as demonstrated by the Gordon Bell paper submission. What we achieved with more than a year of computing time on the previous system at Oak Ridge, Titan, was worth a publication in the journal Nature. With Sierra and Summit, the same result could be achieved in a few months. Not only can we expect exciting new results that improve on our current ones, but we can also tackle entirely new physics problems that were unthinkable before.

Charity: What do you foresee as the next frontier for your area(s) of research, and how do you expect to integrate HPC as part of it?

Enrico: Lattice QCD is currently expanding by incorporating research from Quantum Information Science and Artificial Intelligence, an example of interdisciplinary cross-pollination. It is hard to know where this will take us, but it is easy to see the role HPC will play. At SC18, we saw many HPC applications incorporating deep learning techniques, even scaled up to all the nodes of Summit at exaflops (mixed-precision) performance, and solving important problems that can enhance our everyday lives, from weather forecasts to drug discovery. In particle physics, I hope to see similar endeavors.

Charity: Your team was nominated for a Gordon Bell Award. What helpful hints can you provide to anyone considering seeking a nomination?

Enrico: For us, the formula was a cool application with a new physics algorithm plus access to new HPC systems; that combination is the ‘ideal’ recipe for pursuing a Gordon Bell award. Tackling an outstanding scientific problem is another ‘plus.’ We were aiming to explain a deep mystery in the lifetime of the neutron, starting directly from the equations of the fundamental theory of the neutron’s constituents, quarks and gluons. However, without Summit or Sierra, it would have been just a ‘simple’ scientific achievement and not an award-worthy supercomputing application.

Charity: The SC19 theme is ‘HPC Is Now.’ What does that concept mean to you?

Enrico: To me, it means that HPC is already having an effect on everyone’s lives, right now, everywhere in the world. HPC applications have direct repercussions on medicine, farming, climate and more. ‘HPC Is Now’ makes it clear that we are not waiting for the future—we are the future.

Read more about this Gordon Bell-nominated work: “Simulating the Weak Death of the Neutron in a Femtoscale Universe with Near-Exascale Computing.”

SC19 takes place Nov. 17-22 in Denver.

Check out our insideHPC Events Calendar