

NERSC Announces Exascale Science Application Program Projects

NERSC has accepted a selection of key DOE science projects into its NERSC Exascale Scientific Applications Program (NESAP), a collaborative effort in which NERSC will partner with code teams to prepare for the NERSC-8 Cori manycore architecture.

NESAP represents an important opportunity for researchers to prepare application codes for the new architecture and to help advance the missions of the Department of Energy’s Office of Science. The NESAP partnership will allow 20 projects to collaborate with NERSC, Cray, and Intel, providing access to early hardware along with special training and preparation sessions with Intel and Cray staff. Eight of those 20 will also have the opportunity to host a postdoctoral researcher investigating computational science issues associated with energy-efficient manycore systems. In addition, about 24 more projects will participate in NESAP through NERSC training sessions and early access to prototype and production hardware.

The selected projects were chosen based on computational and scientific reviews by NERSC and other DOE staff. NESAP will begin in Fall 2014 and remain active through the delivery of the Cori system to NERSC in mid-2016. During this period, the 20 project teams, guided by NERSC, Cray, and Intel, will undertake intensive efforts to adapt their software to take advantage of Cori’s Knights Landing manycore architecture and to use the resulting codes to produce pathbreaking science on an architecture that may represent an approach to exascale systems.

The 20 projects included in the NESAP partnership with NERSC, Cray, and Intel are listed below by the program office that manages their NERSC allocation.

Advanced Scientific Computing Research (ASCR):

  • Optimization of the BoxLib Adaptive Mesh Refinement Framework for Scientific Application Codes, Ann Almgren (Lawrence Berkeley National Laboratory)

Biological and Environmental Research (BER):

  • CESM Global Climate Modeling, John Dennis (National Center for Atmospheric Research)
  • High-Resolution Global Coupled Climate Simulation Using The Accelerated Climate Model for Energy (ACME), Hans Johansen (Lawrence Berkeley National Laboratory)
  • Multi-Scale Ocean Simulation for Studying Global to Regional Climate Change, Todd Ringler (Los Alamos National Laboratory)
  • Gromacs Molecular Dynamics (MD) Simulation for Bioenergy and Environmental Biosciences, Jeremy C. Smith (Oak Ridge National Laboratory)
  • Meraculous, a Production de novo Genome Assembler for Energy-Related Genomics Problems, Katherine Yelick (Lawrence Berkeley National Laboratory)

Basic Energy Sciences (BES):

  • Large-Scale Molecular Simulations with NWChem, Eric Jon Bylaska (Pacific Northwest National Laboratory)
  • Parsec: A Scalable Computational Tool for Discovery and Design of Excited State Phenomena in Energy Materials, James Chelikowsky (University of Texas, Austin)
  • BerkeleyGW: Massively Parallel Quasiparticle and Optical Properties Computation for Materials and Nanostructures, Jack Deslippe (NERSC)
  • Materials Science using Quantum Espresso, Paul Kent (Oak Ridge National Laboratory)
  • Large-Scale 3-D Geophysical Inverse Modeling of the Earth, Greg Newman (Lawrence Berkeley National Laboratory)

Fusion Energy Sciences (FES):

  • Understanding Fusion Edge Physics Using the Global Gyrokinetic XGC1 Code, Choong-Seock Chang (Princeton Plasma Physics Laboratory)
  • Addressing Non-Ideal Fusion Plasma Magnetohydrodynamics Using M3D-C1, Stephen Jardin (Princeton Plasma Physics Laboratory)

High Energy Physics (HEP):

  • HACC (Hardware/Hybrid Accelerated Cosmology Code) for Extreme Scale Cosmology, Salman Habib (Argonne National Laboratory)
  • The MILC Code Suite for Quantum Chromodynamics (QCD) Simulation and Analysis, Doug Toussaint (University of Arizona)
  • Advanced Modeling of Particle Accelerators, Jean-Luc Vay (Lawrence Berkeley National Laboratory)

Nuclear Physics (NP):

  • Domain Wall Fermions and Highly Improved Staggered Quarks for Lattice QCD, Norman Christ (Columbia University) and Frithjof Karsch (Brookhaven National Laboratory)
  • Chroma Lattice QCD Code Suite, Balint Joo (Thomas Jefferson National Accelerator Facility)
  • Weakly Bound and Resonant States in Light Isotope Chains Using MFDn (Many Fermion Dynamics, nuclear), James Vary and Pieter Maris (Iowa State University)

Codes and principal investigators for the additional NESAP projects receiving NERSC training sessions and early access to prototype hardware include:

  • GTC-P (Stephane Ethier/PPPL)
  • GTS (William Tang/PPPL)
  • VORPAL (John Cary/TechX)
  • TOAST (Julian Borrill/LBNL)
  • Qbox/Qb@ll (Yosuke Kanai/U. North Carolina)
  • CALCLENS and ROCKSTAR (Risa Wechsler/Stanford)
  • WEST (Marco Govoni/U. Chicago)
  • QLUA (William Detmold/MIT)
  • P3D (James Drake/U. Maryland)
  • WRF (John Michalakes/ANL)
  • PHOSIM (Andrew Connolly/U. Washington)
  • SDAV tools (Hank Childs/U. Oregon)
  • M3D/M3D-K (Linda Sugiyama/MIT)
  • DGDFT (Lin Lin/U.C. Berkeley)
  • GIZMO/GADGET (Joel Primack/U.C. Santa Cruz)
  • ZELMANI (Christian Ott/Caltech)
  • VASP (Martijn Marsman/U. Vienna)
  • NAMD (James Phillips/U. Illinois)
  • PHOENIX-3D (Eddie Baron/U. Oklahoma)
  • ACE3P (Cho-Kuen Ng/SLAC)
  • S3D (Jacqueline Chen/SNL)
  • ATLAS (Paolo Calafiura/LBNL)
  • BBTools genomics tools (Jon Rood/LBNL, JGI)
  • DOE MiniApps (Alice Koniges, LBNL)
