

New Leaders Join Exascale Computing Project

The US Department of Energy’s Exascale Computing Project has announced three leadership staff changes within the Hardware and Integration (HI) group. “Over the past several months, ECP’s HI team has been adapting its organizational structure and key personnel to prepare for the next phase of exascale hardware and software integration.”

Podcast: Simplifying the Deployment of HPC Tools and Libraries

In this Let’s Talk Exascale podcast, Sameer Shende from the University of Oregon describes progress on the Extreme-scale Scientific Software Stack (E4S). E4S is a community effort to provide open-source software packages for developing, deploying, and running scientific applications on HPC platforms. “Container technology is promising because it enables the user to take an existing set of libraries and tools, consider the dependency metrics of a particular software product, and deploy the software efficiently. And there’s only one kernel that’s running when a container is deployed, unlike other virtualization approaches. So it’s very efficient.”

MLPerf-HPC Working Group Seeks Participation

In this special guest feature, Murali Emani from Argonne writes that a team of scientists from DOE labs has formed a working group called MLPerf-HPC to focus on benchmarking machine learning workloads for high performance computing. “As machine learning (ML) is becoming a critical component to help run applications faster, improve throughput and understand the insights from the data generated from simulations, benchmarking ML methods with scientific workloads at scale will be important as we progress towards next generation supercomputers.”

Podcast: Co-Design for Online Data Analysis and Reduction at Exascale

In this Let’s Talk Exascale podcast, Ian Foster from Argonne National Lab describes how the CODAR project at ECP is addressing the need for data reduction, analysis, and management in the exascale era. “When compressing data produced by a simulation, the idea is to keep the parts that are scientifically interesting and toss those that are not. However, every application and, perhaps, every scientist, has a different definition of what ‘interesting’ means in that context. So, CODAR has developed a system called Z-checker to enable users to monitor the compression method.”
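The idea of monitoring a compression method can be made concrete with a small sketch. The Python example below is illustrative only and does not use Z-checker’s actual API: it “compresses” a simulated field by quantizing values under a user-chosen absolute error bound, then performs the kind of check a compression-quality monitor makes, verifying that the point-wise reconstruction error never exceeds the bound.

```python
import numpy as np

def compress_with_bound(data, abs_err):
    """Illustrative error-bounded 'compression': snap values to a grid of
    spacing 2*abs_err, so the point-wise error can never exceed abs_err."""
    step = 2.0 * abs_err
    codes = np.round(data / step).astype(np.int64)  # compact integer codes
    return codes, step

def decompress(codes, step):
    """Reconstruct approximate values from the integer codes."""
    return codes.astype(np.float64) * step

def check_error_bound(original, reconstructed, abs_err):
    """The kind of check a quality monitor performs: report the maximum
    point-wise error and whether the requested bound actually holds."""
    max_err = np.max(np.abs(original - reconstructed))
    return max_err, max_err <= abs_err

rng = np.random.default_rng(0)
field = rng.normal(size=10_000)        # stand-in for simulation output
codes, step = compress_with_bound(field, abs_err=1e-3)
recon = decompress(codes, step)
max_err, ok = check_error_bound(field, recon, abs_err=1e-3)
```

Real scientific lossy compressors are far more sophisticated (prediction, transforms, entropy coding), but the contract being monitored is the same: the reconstruction must stay within the error tolerance the scientist deems acceptable.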

Exascale Computing Project Announces Staff Changes Within Software Technology Group

The US Department of Energy’s Exascale Computing Project (ECP) has announced the following staff changes within the Software Technology group. Lois Curfman McInnes from Argonne will replace Jonathan Carter as Deputy Director for Software Technology. Meanwhile, Sherry Li is now team lead for Math Libraries. “We are fortunate to have such an incredibly seasoned, knowledgeable, and respected staff to help us lead the ECP efforts in bringing the nation’s first exascale computing software environment to fruition,” said Mike Heroux from Sandia National Labs.

Postdoc Symposium at Berkeley Lab Looks to Exascale for Modeling and Simulation

Twenty-two postdoctoral fellows from across the Computing Sciences Area shared the status of their current projects at the first CSA Postdoc Symposium, held January 30-31 at Berkeley Lab. Their presentations covered a broad range of research topics, including code optimization, machine/deep learning, network routing, and modeling and simulation of complex scientific problems on exascale and other next-generation computer architectures.

Podcast: Solving Multiphysics Problems at the Exascale Computing Project

In this Let’s Talk Exascale Podcast, Stuart Slattery and Damien Lebrun-Grandie from ORNL describe how they are readying algorithms for next-generation supercomputers at the Department of Energy. “The mathematical library development portfolio of the Software Technology (ST) research focus area of the ECP provides general tools to implement complex algorithms. These algorithms are designed to scale up for supercomputers so that ECP teams can then use them to accelerate the development and improve the performance of science applications on DOE high-performance computing architectures.”

Interview: Exascale Computing Project Update for 2020

In this video, Exascale Computing Project Director Doug Kothe describes how disciplined and tailored project management led to very impressive results in what was likely the most comprehensive independent review of the project to date. “ECP’s products will be robust, production ready, and functional right out of the box; and ECP is driving the sharing of information through regular training not only with ECP participants but also the broader US high-performance computing community to lower barriers to using exascale systems and accelerated architectures in general.”

Podcast: Rewriting NWChem for Exascale

In this Let’s Talk Exascale podcast, researchers from the NWChemEx project team describe how they are readying the popular code for exascale. The NWChemEx team’s most significant success so far has been to scale coupled-cluster calculations to a much larger number of processors. “In NWChem we had the Global Arrays as a toolkit to be able to build parallel applications.”

Podcast: Earth and Space Science for Exascale

In this podcast, Anshu Dubey of Argonne National Laboratory describes the Earth and Space Science application portfolio in the Exascale Computing Project (ECP). “By and large, these applications are solving partial differential equations, and so there is that generality,” Dubey said. “Most times, the range of scales is so huge that you cannot resolve every scale, so then you have to do something called subgrid models, which can be very boutique.”
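Dubey’s point about subgrid models can be illustrated with a minimal sketch (not taken from any ECP code): when a fine-scale field is averaged onto a coarse grid, nonlinear terms computed from the averaged field differ from the true coarse-grid average of those terms, and that difference is exactly what a subgrid (closure) model must supply.

```python
import numpy as np

# Fine-scale field: a resolved large-scale wave plus small-scale detail
# that a coarse mesh cannot represent.
n_fine, block = 1024, 16                 # 16 fine cells per coarse cell
x = np.linspace(0.0, 2.0 * np.pi, n_fine, endpoint=False)
u = np.sin(x) + 0.1 * np.sin(40.0 * x)

def coarsen(f, block):
    """Block-average a fine field onto the coarse grid (what the coarse mesh 'sees')."""
    return f.reshape(-1, block).mean(axis=1)

u_bar = coarsen(u, block)                # resolved field on the coarse grid

# For a nonlinear term like u^2, averaging and squaring do not commute:
exact = coarsen(u * u, block)            # true coarse average of u^2
resolved = u_bar * u_bar                 # what the coarse grid can compute itself
subgrid = exact - resolved               # unresolved part a subgrid model must represent
```

The `subgrid` residual is nonzero wherever unresolved fluctuations exist, which is why, as Dubey notes, each application tends to need its own carefully tailored closure rather than a one-size-fits-all model.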