Video: Readying Quantum Chromodynamics for Exascale


Andreas Kronfeld from Fermilab

In this video, Fermilab scientist Andreas Kronfeld discusses the LatticeQCD project for Quantum Chromodynamics. As part of the Exascale Computing Project, LatticeQCD is increasing the precision of QCD calculations to understand the properties of quarks and gluons in the Standard Model of particle physics, the theory that describes the basic building blocks (or fundamental particles) of the universe and how they interact.

Quantum chromodynamics (QCD) is the quantum field theory of the subatomic particles called quarks and gluons. QCD explains what is known as the strong nuclear force, the interaction that holds protons and neutrons together in atomic nuclei and shapes the structure of nearly all visible matter. Lattice QCD calculations are the scientific instrument that connects the observed properties of hadrons (particles that contain quarks) to the fundamental laws governing quarks and gluons. This instrument serves as a critical complement to experiments such as those taking place at Brookhaven National Laboratory and at CERN to study a phenomenon called the quark-gluon plasma.
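In schematic terms (the standard textbook form, included here for orientation rather than taken from the video), a lattice QCD calculation estimates an observable as an average over Monte Carlo samples of the gluon fields on a discrete spacetime lattice:

$$
\langle \mathcal{O} \rangle \;=\; \frac{1}{Z} \int \mathcal{D}U \, \det D[U] \, e^{-S_g[U]} \, \mathcal{O}[U],
\qquad
Z \;=\; \int \mathcal{D}U \, \det D[U] \, e^{-S_g[U]},
$$

where $U$ denotes the gluon (gauge-link) fields, $S_g[U]$ is the gluon action, and $D[U]$ is the lattice Dirac operator for the quarks (schematically, one determinant factor per quark flavor). The integral is evaluated by generating an ensemble of representative field configurations and averaging over them.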

“To interpret these experiments and many others in particle physics and all of nuclear physics, we need both the precision side and the illumination side,” said Kronfeld, principal investigator of the LatticeQCD project. “Exascale computing will be absolutely essential to extending the precision part of what we do to small nuclei and more complicated properties of protons and neutrons that we’ve been able to achieve to date,” he said. “These calculations are not only interesting in their own right because they make clear an emerging general class of fascinating physical phenomena, but they’re also central for interpreting all experiments in particle physics and nuclear physics.”

Among the achievements made possible by lattice QCD is the calculation of the masses of quarks. “These are fundamental constants of nature comparable to the electron mass, and so they exemplify the use of precision,” Kronfeld said. “We now want to extend a similar level of rigor to the neutrino sector.”

Subtle and elusive particles, neutrinos permeate the universe and pass through matter but rarely interact. The Standard Model predicted that neutrinos would have no mass, but about twenty years ago experiments revealed that they do in fact have masses, albeit tiny. Moreover, they are the most abundant particle with mass, and by learning more about them, researchers could increase understanding of the most fundamental physics in the universe.

In experiments, neutrinos are scattered off the nucleus of a carbon, oxygen, or argon atom. “We need to understand not only how the neutrino interacts with a nucleon [a proton or neutron] but also how it interacts with the whole nucleus,” Kronfeld said. “This is why it is so important to extend the precision that we’ve done for similar things to nucleons and nuclei.”

Famous for the study of neutrinos, Fermilab shoots beams of neutrinos at detectors located on site and in Minnesota. In the future, the lab will target detectors even farther away at the Deep Underground Neutrino Experiment (DUNE) under construction at the Sanford Underground Research Facility in South Dakota. More than 1,000 collaborators are working on the DUNE project, which is a leading-edge, international experiment for neutrino science and proton decay studies.

Meanwhile, as another means of exploring the nature of matter, researchers collide electrons with protons at Jefferson Lab to get a more vivid picture of the proton. As with the neutrino experiments, theoretical calculations are required to make sense of the results; the same is true for heavy-ion collision work in nuclear physics. “There has been excellent cross talk between results from such experimentation on the one hand and lattice QCD calculations on the other,” Kronfeld said.

“What we now think is that there is a critical point, a point where water vapor and liquid water and ice can coexist when you have a high enough baryon [composite subatomic particle] density,” he said. “We’ll need exascale computing to understand that point at the same time that the experimentalists are trying to discover it. Again, that’s a case where we learned qualitative and quantitative information. The first is interesting—the second is essential.”

Pre-exascale Improvements on the Path to Exascale

Kronfeld explained that the pre-exascale supercomputer Summit at the Oak Ridge Leadership Computing Facility (OLCF) at Oak Ridge National Laboratory is allowing the LatticeQCD project team to increase the feasibility of its complicated, difficult calculations and thus make expensive experiments worth the investment. He said the advances on Summit are manifested in four ways.

First, a group from the LatticeQCD project team is striving to understand how to do the computation for what is called the Dirac equation, which is central to research in electrodynamics and chromodynamics. The equation is used repeatedly in lattice QCD calculations. On Summit the team is devising and implementing better algorithms to solve the equation.
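In practical terms, “solving the Dirac equation” on the lattice means repeatedly solving a very large, sparse linear system on each gluon-field configuration; in standard notation (not quoted from the video),

$$
D[U]\,\psi = \eta,
$$

where $D[U]$ is the discretized Dirac operator on a given gauge configuration $U$, $\eta$ is a source vector, and $\psi$ is the quark propagator being computed. On fine lattices the system has hundreds of millions or more unknowns, and it must be solved for many sources and many configurations, which is why better solver algorithms pay off so directly.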

“I’m excited by the improvement in the solutions to the Dirac equation the group has made,” he said. “My collaborators have come up with multigrid methods that now finally work after 20 years of dreaming about it.”
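To give a rough sense of the multigrid idea, the sketch below applies a two-grid cycle to a deliberately simple stand-in problem, a 1-D Laplacian: a cheap smoother damps high-frequency error on the fine grid, and the remaining smooth error is removed on a coarser grid. This is not the LatticeQCD project’s code; production lattice-QCD multigrid works on the four-dimensional, gauge-field-dependent Dirac operator and builds its coarse levels adaptively, but the structure of the cycle is the same.

```python
# Toy sketch of the multigrid idea (not the LatticeQCD project's code):
# a two-grid cycle for a 1-D Laplacian standing in for the Dirac operator.
import numpy as np

def laplacian(n):
    """Tridiagonal (-1, 2, -1) matrix: a simple stand-in operator."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def smooth(A, x, b, iters=3, omega=2.0 / 3.0):
    """Weighted Jacobi sweeps: cheap, and effective at damping high-frequency error."""
    d_inv = 1.0 / np.diag(A)
    for _ in range(iters):
        x = x + omega * d_inv * (b - A @ x)
    return x

def restrict(r):
    """Full-weighting restriction of a fine-grid residual to the coarse grid."""
    return 0.25 * (r[0:-2:2] + 2.0 * r[1:-1:2] + r[2::2])

def prolong(ec, n):
    """Linear interpolation of a coarse-grid correction back to the fine grid."""
    e = np.zeros(n)
    e[1:-1:2] = ec           # coarse points coincide with odd fine-grid points
    e[0:-1:2] += 0.5 * ec    # contribute to even points on the left
    e[2::2] += 0.5 * ec      # contribute to even points on the right
    return e

def two_grid(A, Ac, x, b):
    """One two-grid cycle: smooth, correct on the coarse grid, smooth again."""
    x = smooth(A, x, b)                     # pre-smoothing
    r = b - A @ x                           # fine-grid residual
    ec = np.linalg.solve(Ac, restrict(r))   # coarse-grid error equation (solved exactly here)
    x = x + prolong(ec, len(b))             # coarse-grid correction
    return smooth(A, x, b)                  # post-smoothing

n = 63                                      # fine grid; coarse grid has (n - 1) // 2 points
A = laplacian(n)
Ac = 0.25 * laplacian((n - 1) // 2)         # Galerkin coarse operator for this R and P
b = np.random.default_rng(0).standard_normal(n)
x = np.zeros(n)
for cycle in range(10):
    x = two_grid(A, Ac, x, b)
    print(cycle, np.linalg.norm(b - A @ x))
```

Each cycle reduces the residual by a roughly constant factor regardless of how fine the grid is, which is the property that makes multigrid so attractive for the ever-finer lattices an exascale machine can accommodate.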

A second focus is the probing of small nuclei and the complicated calculations it requires. Another LatticeQCD project group is working out how to perform these calculations efficiently by mapping the details of the problem onto the architecture of Summit. “We anticipate that the Frontier exascale machine will be similar, and when we learn more about Aurora, the group will be mapping to that system as well,” Kronfeld said.

Another task is to evolve what is known as a Markov chain, which Kronfeld described as a way of generating random snapshots of a process, produced at a rate that depends on the details of the algorithms. The LatticeQCD project team has a group that is working to accelerate the Markov chain.

“When I worked on the Markov chain as a graduate student, I wasn’t successful because, frankly, you couldn’t see the difference in speedup using the computers we had then, but it seems to be bearing fruit now—that’s personally satisfying,” he said.
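As a toy picture of what such a chain does (a sketch under simplified assumptions, not the project’s algorithm): the snippet below runs a Metropolis chain for a free scalar field on a small 1-D lattice, discards the thermalization phase, and keeps thinned “snapshots” from which an observable is averaged. Real lattice-QCD ensembles are generated with far more sophisticated updates, such as Hybrid Monte Carlo, on SU(3) gauge fields, but the chain-of-snapshots structure is the same.

```python
# Toy sketch of a Markov chain producing field "snapshots" (not the project's code):
# a Metropolis chain for a free scalar field on a periodic 1-D lattice.
import numpy as np

rng = np.random.default_rng(1)

def action(phi, m2=0.5):
    """Euclidean action: nearest-neighbour kinetic term plus a mass term."""
    kinetic = 0.5 * np.sum((np.roll(phi, -1) - phi) ** 2)
    return kinetic + 0.5 * m2 * np.sum(phi ** 2)

def metropolis_sweep(phi, step=0.5):
    """Propose a local change at every site; accept with probability min(1, e^{-dS}).
    Recomputing the full action per proposal is wasteful but keeps the example short."""
    for i in range(len(phi)):
        old_value, old_action = phi[i], action(phi)
        phi[i] = old_value + step * rng.uniform(-1.0, 1.0)
        d_s = action(phi) - old_action
        if d_s > 0.0 and rng.random() >= np.exp(-d_s):
            phi[i] = old_value               # reject: keep the previous configuration
    return phi

phi = np.zeros(64)                           # "cold" starting configuration
snapshots = []
for sweep in range(2000):
    phi = metropolis_sweep(phi)
    if sweep >= 500 and sweep % 50 == 0:     # discard thermalization, then thin the chain
        snapshots.append(phi.copy())

# An observable is estimated as an average over the stored snapshots.
print("<phi^2> ~", np.mean([np.mean(s ** 2) for s in snapshots]))
```

The cost of generating decorrelated snapshots is what the LatticeQCD group aims to drive down: the faster the chain decorrelates, the fewer sweeps are wasted between usable configurations.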

The fourth area being pursued by the LatticeQCD project team on Summit is the development of software better suited to the aims of the effort. This improved software, Kronfeld explained, will be crucial to analyzing data on an exascale machine. Researchers at the University of Edinburgh, outside ECP, are collaborating on the work.

“The Exascale Computing Project has been breathtaking to watch,” Kronfeld said. “There’s never been anything like this before. We had support at a smaller scale, but the ambition has led to improvements in algorithms that we used to dream about but didn’t have the resources, the support, and also the access to the machine, to test and verify. We had no idea how essential it would be before we started. Whoever came up with this idea really needs to be commended. I think it is a fantastic investment. These computers are not cheap, and to have people thoughtfully consider how to use them before they come online has just been brilliant.”

Source: Scott Gibson at the Exascale Computing Project
