There’s an unsung hero in Livermore Lab’s announcement this week of its historic breakthrough in fusion energy. It went unsung even by Livermore itself. It’s HPC.
Two supercomputers powered the research of hundreds of scientists at Livermore’s National Nuclear Security Administration facility:
— Sierra, the IBM supercomputer installed at NNSA in 2018 and currently ranked number six on the TOP500 list of the world’s most powerful supercomputers.
— JADE, a Penguin Solutions system powered by Intel Xeon CPUs, installed in 2016 and now ranked number 258 on the TOP500.
On Dec. 5, a team at LLNL’s National Ignition Facility (NIF) conducted what the lab said is the first controlled fusion experiment in history to achieve scientific energy breakeven, meaning it produced more energy from fusion than the laser energy used to drive it. LLNL’s experiment crossed that threshold by delivering 2.05 megajoules (MJ) of laser energy to the target and producing 3.15 MJ of fusion energy output, a target gain of roughly 1.5, demonstrating for the first time a fundamental science basis for inertial fusion energy (IFE). The lab said the research provides new capabilities to support NNSA’s Stockpile Stewardship Program and insights into clean fusion energy that, ultimately, could contribute to a future net-zero carbon energy source.
The lab also noted that “Many advanced science and technology developments are still needed to achieve simple, affordable IFE to power homes and businesses, and DOE is currently restarting a broad-based, coordinated IFE program in the United States.”
NIF researchers have been studying nuclear fusion for more than a decade, using lasers to create the conditions that cause hydrogen atoms to fuse and release vast amounts of energy. But since the facility began operations in 2009, a fusion reaction that produces a net gain of energy had eluded scientists.
Until now.
We spoke with Brian Spears, a Livermore physicist and PI who does radiation-hydrodynamics simulations of inertial confinement fusion (ICF) implosions at the NIF. He said the fusion team performed hundreds of thousands of simulations leading to the breakthrough.
“Our laboratories have two pillars of excellence, in simulation and high performance computing, and in large-scale experimentation,” he said. “And with this, the net result was just a fantastic demonstration of what HPC can do for us. In fact, before this shot, we made an integrated prediction that said, for the first time, the most likely thing that would happen was ignition, getting more energy out of the target than was put in by the laser. And that’s exactly what happened.”
Much of the fusion work was conducted with HYDRA, a multi-physics simulation code co-developed at Livermore by Michael “Marty” Marinak, a physicist and PI at the NIF focused on inertial confinement fusion.
HYDRA is used to simulate experiments carried out at the NIF and other high energy density physics facilities. The code has packages to simulate radiation transfer, atomic physics, hydrodynamics, laser propagation and other physics effects. HYDRA has over 1 million lines of code and includes both MPI and thread-level (OpenMP and pthreads) parallelism.
“It’s a massively parallel integrated radiation hydrodynamics code,” Spears said. “It’s deployed across all of our platforms here,” on both the older, CPU-based JADE system and the hybrid NVIDIA GPU- and IBM POWER CPU-based Sierra.
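For a feel for that structure, here is a minimal Python sketch of the hybrid MPI-plus-threads pattern described above, using mpi4py and a thread pool. It illustrates the parallelization style only; the zone count, the placeholder physics kernel and the four-thread pool are invented for the example and are not HYDRA code.

```python
# Minimal sketch of hybrid MPI + thread-level parallelism, analogous in spirit
# to the MPI plus OpenMP/pthreads structure described for HYDRA.
# Illustrative only: the zone count and "advance_zone" kernel are placeholders.
from concurrent.futures import ThreadPoolExecutor

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_ZONES = 1_000_000                                           # hypothetical mesh size
local_zones = np.array_split(np.arange(N_ZONES), size)[rank]  # this rank's share


def advance_zone(zone_ids):
    """Placeholder for a physics kernel (hydro, radiation transport, ...)."""
    return float(np.sum(np.sqrt(zone_ids + 1.0)))


# Thread-level parallelism within the rank, akin to OpenMP worksharing.
with ThreadPoolExecutor(max_workers=4) as pool:
    chunks = np.array_split(local_zones, 4)
    local_result = sum(pool.map(advance_zone, chunks))

# MPI reduction across ranks, e.g. to accumulate a global diagnostic.
total = comm.reduce(local_result, op=MPI.SUM, root=0)
if rank == 0:
    print(f"global diagnostic from {size} ranks: {total:.3e}")
```

Assuming mpi4py and an MPI implementation are installed, a sketch like this would be launched with something like mpirun -n 8 python hybrid_sketch.py, with each of the eight ranks running its own local thread pool.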
Spears said the team of scientists conducted two types of simulation: high-fidelity, highly resolved 2D and 3D simulations for what he called “capability demonstrations,” and “moderate fidelity modes to probe the design space in very high dimensions.”
“And because we have large machines, we can lay down 100,000 or 200,000 simulations, still at relatively high fidelity. Each one of those 100,000 or 200,000 may take half a day or a day on a full node of a platform that we’re using. So we really push in both directions, both the capability and the capacity.”
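To make the “capacity” side of that concrete, below is a hedged sketch of how such a design-space ensemble might be laid out: space-filling Latin hypercube samples over a few design parameters, with each sample becoming one independent simulation job. The parameter names, bounds and ensemble size here are invented for illustration and are not the NIF team’s actual study.

```python
# Sketch of laying out a large design-space ensemble: draw space-filling samples
# over a few hypothetical design parameters, one simulation job per sample.
# Parameter names, bounds and ensemble size are illustrative, not LLNL's.
from scipy.stats import qmc

params = ["laser_energy_MJ", "shell_thickness_um", "fill_density_mg_cc", "dopant_fraction"]
lower = [1.8, 60.0, 0.3, 0.000]
upper = [2.2, 90.0, 0.6, 0.010]

sampler = qmc.LatinHypercube(d=len(params), seed=42)
unit_samples = sampler.random(n=100_000)             # 100,000 ensemble members
designs = qmc.scale(unit_samples, lower, upper)      # map onto physical bounds

# Each row would become one moderate-fidelity simulation, e.g. written out as an
# input deck and handed to the batch scheduler; print the first few as a check.
for run_id, design in enumerate(designs[:3]):
    print(f"run {run_id:06d}: " + ", ".join(f"{p}={v:.4g}" for p, v in zip(params, design)))
```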
The fusion ignition project also utilized a new capability using AI and machine learning that leveraged the GPUs in Sierra for lower precision compute. “We’ve built neural network models on top of those high capacity and high fidelity simulations,” Spears said. “So we’re really filling out the full spectrum of the computation space.”
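As a rough sketch of that idea, the snippet below trains a small neural-network surrogate on a stand-in ensemble of simulation inputs and outputs, using mixed-precision (float16) arithmetic of the kind GPU accelerators such as Sierra’s are well suited to. The data, architecture and hyperparameters are placeholders, not the lab’s actual models.

```python
# Minimal sketch of a neural-network surrogate trained on an ensemble of
# simulations, using mixed-precision (float16) compute on the GPU.
# Data, architecture and hyperparameters are placeholders, not LLNL's models.
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in training set: design parameters X -> simulated fusion yield y.
X = torch.rand(100_000, 4, device=device)
y = X.pow(2).sum(dim=1, keepdim=True) + 0.05 * torch.randn(100_000, 1, device=device)

surrogate = nn.Sequential(
    nn.Linear(4, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 1),
).to(device)

opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

for epoch in range(10):
    opt.zero_grad()
    # Lower-precision compute: run the forward pass in float16 where safe.
    with torch.autocast(device_type=device, dtype=torch.float16, enabled=(device == "cuda")):
        loss = nn.functional.mse_loss(surrogate(X), y)
    scaler.scale(loss).backward()
    scaler.step(opt)
    scaler.update()
```

The payoff of a model like this is speed: once trained, it evaluates a candidate design in a fraction of a second rather than the half a node-day or so Spears describes for a single simulation, which is what makes sweeping over very high-dimensional design spaces practical.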
Under tongue-in-cheek questioning as to why the fusion ignition announcement made no mention of Sierra’s and JADE’s role, Spears hastened to give credit to the supercomputers.
“This breakthrough, and that facility (NIF) that we shot it in, none of that exists without HPC,” he said. “In fact, the pressures that we put on those codes and the difficulty of predicting the experiments, because they are so demanding, have meant that HPC has kicked open new doors for entirely new things,” including COVID and other medical therapeutics research and work on nuclear stockpile stewardship.
“We’ve developed AI and machine learning technologies to bridge those codes to the actual experiment data so that we could make this very accurate prediction and know that the most likely thing that was going to happen was (fusion) ignition,” Spears said. “And those tools that we build to join simulation with experiment are equally applicable in the space where we’re trying to assure our biological safety as a country as they are for understanding fusion and taking care of our stockpile.”
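One generic way such a bridge can work, shown here purely as an illustration and not as the lab’s published method, is to take a surrogate trained on simulations and recalibrate a small part of it against the few experimental shots available, so that its predictions inherit the systematic corrections the real data reveal. A minimal sketch, reusing the hypothetical surrogate architecture from the previous example:

```python
# Sketch of one generic way to bridge a simulation-trained surrogate to sparse
# experimental data: freeze most of the network and fine-tune only the last
# layer on a handful of real shots. Illustrative only; not LLNL's actual method.
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Surrogate with the same hypothetical architecture as the earlier sketch,
# assumed already trained on the simulation ensemble.
surrogate = nn.Sequential(
    nn.Linear(4, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 1),
).to(device)

# A handful of hypothetical experimental shots: design parameters -> measured yield.
X_exp = torch.rand(12, 4, device=device)
y_exp = torch.rand(12, 1, device=device)

# Freeze the simulation-trained layers; recalibrate only the output layer.
for p in surrogate.parameters():
    p.requires_grad = False
for p in surrogate[-1].parameters():
    p.requires_grad = True

opt = torch.optim.Adam(surrogate[-1].parameters(), lr=1e-4)
for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(surrogate(X_exp), y_exp)
    loss.backward()
    opt.step()

# The recalibrated model can then be evaluated at a planned shot's design
# parameters to produce an experiment-informed pre-shot prediction.
```

In practice this kind of simulation-to-experiment bridging is typically paired with uncertainty estimates, which is what allows a statement like “the most likely outcome is ignition” rather than a single point prediction.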
“We built this facility on the shoulders of HPC,” he said. “The laboratory and its mission push the envelope for making HPC be what it is today…. We’re moving to new ways of computing that join high precision compute with machine learning and AI to do more rapid discovery work in the future. We’ve learned so much in the process, and none of it happens without HPC.”