

Podcast: Rewriting NWChem for Exascale

From left, Sriram Krishnamoorthy (Pacific Northwest National Laboratory), Bert de Jong (Lawrence Berkeley National Laboratory), and Huub Van Dam (Brookhaven National Laboratory) of the NWChemEx project at the SC19 conference in Denver, November 2019. They lead various aspects of the project: Krishnamoorthy, computer science; de Jong, outreach and communications; and Van Dam, code testing and assessment.

In this Let’s Talk Exascale podcast, researchers from the NWChemEx project team describe how they are readying the popular computational chemistry code NWChem for exascale systems.

“NWChemEx is about a rewrite of a computational chemistry code to run larger simulations, and to run them faster, on the coming exascale computers,” de Jong said.

The project’s efforts are directed at three specific challenges. “One challenge is we didn’t design NWChem for reduced-scaling methods, which involve less regular structure than dense methods, and so we are currently interested in accomplishing that in NWChemEx,” Krishnamoorthy said. “Second, the architectures that NWChem was designed for don’t exist anymore. So, we must design for today’s heterogeneous architectures. And third, NWChem uses the same recipe for distributing work irrespective of what the input is. Depending on the characteristics of the problem, you might want to change what you do, within one run or between runs. So, this input- or problem-specific heterogeneity is a risk. To address this, we are developing TAMM (Tensor Algebra for Many-body Methods), a framework that advances methods decoupled from architecture details, enabling quick development and continued optimization. Those are the three main axes we are trying to solve in NWChemEx.”

The NWChemEx team’s most significant success so far has been to scale coupled-cluster calculations to a much larger number of processors. “In NWChem we had the Global Arrays toolkit to be able to build parallel applications,” Van Dam said. “We now have TAMM, the Tensor Algebra for Many-body Methods library, which not only takes care of storage but also adds scheduling properties to the way the work gets executed. And that has enabled us to scale to much larger numbers of processors than was previously possible.”
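The tensor contractions Van Dam describes are the computational core of coupled-cluster methods. As a rough, hypothetical sketch (TAMM itself is a distributed C++ library; the array names and sizes here are invented purely for illustration), `numpy.einsum` can express the kind of multi-index contraction such a library must store, partition, and schedule:

```python
import numpy as np

# Illustrative orbital-space sizes: o occupied, v virtual orbitals.
o, v = 4, 8

rng = np.random.default_rng(0)
# Random stand-ins for two-electron integrals <ab||ij> and cluster
# amplitudes t_ab^ij; real codes compute these from the molecular system.
V = rng.standard_normal((v, v, o, o))
T = rng.standard_normal((v, v, o, o))

# An MP2/CCD-style energy expression: contract over all four shared
# indices a, b, i, j to produce a single scalar.
E_corr = np.einsum('abij,abij->', V, T)
```

In a TAMM-like framework the index expression stays the same, but the tensors are tiled and distributed, and a scheduler assigns the resulting contraction tasks to CPUs or GPUs; that separation of the mathematical expression from its execution is what the quoted "scheduling properties" refer to.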

De Jong said collaborative dynamics in the NWChemEx project are similar to what they were when the original software was created decades ago, adding, “When NWChem was designed from scratch, we had applied mathematicians and computer scientists all working toward the same goal: making this code scale and go as fast as possible to deliver science solutions to challenges that we have in our community.”

The impact that NWChemEx will have on science is tied to the calculations that are guiding the project’s development. One calculation is in biology and the other is in chemistry (catalysis). “Both of them rely on being able to accurately calculate free energies on molecular systems that are big enough to realistically represent the kinds of problems we’re interested in,” Van Dam said. “And with the code that we are developing, we will be able to do that. We will be able to run simulations for long enough and to run them at reduced cost so that we can actually tackle the problems of scale that we need to. That capability will have a big impact on how the field will move forward.”

The development of NWChemEx is a new beginning for the chemistry application and sets the stage for future innovations.

“Just like NWChem was a multi-decade effort, we think of NWChemEx not as a project that’s going to end when the exascale machines are going to be up,” Krishnamoorthy said. “We think of this as the starting point for the next-generation code. In this we are not just working within the five labs and the universities that are part of this team, we are also trying to collaborate with other efforts in the National Science Foundation and elsewhere to have a long-term part in the next generation of methods to be developed and the science to be delivered.”

Source: Scott Gibson at the Exascale Computing Project

Download the MP3
