CoEC Targets Combustion Breakthroughs with Exascale Computing

Barcelona, 29 October 2020 – The European Union is committed to achieving net-zero greenhouse gas emissions by 2050. Reaching this goal will require coordinated research and innovation efforts to make low- and zero-carbon solutions economically viable. The recently launched Center of Excellence in Combustion (CoEC) addresses this challenge using advanced modelling and simulation […]

The Hyperion-insideHPC Interviews: NERSC’s Jeff Broughton on the End of the Top500 and Exascale Begetting Petaflops in a Rack

NERSC’s Jeff Broughton’s career extends back to HPC ancient times (1979) when, fresh out of college, he was promoted to a project management role at Lawrence Livermore National Laboratory – a big job for a young man. Broughton has taken on big jobs in the ensuing 40 years. In this interview, he talks about such […]

SC20: IDEAS Productivity Team Announces Software Events

The IDEAS Productivity Team and others in the HPC community are organizing software-related events at SC20, Nov. 9-19. IDEAS is a family of projects supported by the U.S. Department of Energy that addresses challenges in HPC software development productivity and software sustainability in computational science and engineering. One of them, IDEAS-ECP, is supported by DOE’s Exascale Computing Project to […]

Getting to Exascale: Nothing Is Easy

In the weeks leading to today’s Exascale Day observance, we set ourselves the task of asking supercomputing experts about the unique challenges, the particularly vexing problems, of building a computer capable of 10,000,000,000,000,000,000 calculations per second. Readers of this publication might guess, given Intel’s trouble producing the 7nm “Ponte Vecchio” GPU for its delayed Aurora system for Argonne National Laboratory, that compute is the toughest exascale nut to crack. But according to the people we interviewed, the difficulties of engineering exascale-class supercomputing run the systems gamut. As we listened to exascale’s daunting litany of technology difficulties….

Where Have You Gone, IBM?

The company that built the world’s No. 2 and No. 3 most powerful supercomputers is, to all appearances, backing away from the supercomputer systems business. IBM, whose Summit and Sierra CORAL-1 systems set the global standard for pre-exascale supercomputing, failed to win any of the three exascale contracts, and since then IBM has seemingly withdrawn from the HPC systems field. This has been widely discussed within the HPC community for at least the last 18 months. In fact, an industry analyst told us that as long ago as the annual ISC Conference in Frankfurt four years ago, he was shocked when IBM told him the company was no longer interested in the HPC business per se….

What May Come from Exascale? Improved Medicines, Longer-range Batteries, Better Control of 3D Parts, for Starters

As Exascale Day (Oct. 18) approaches, we thought it appropriate to post a recent article from Scott Gibson of the Exascale Computing Project (ECP), an overview of the anticipated advances in scientific discovery enabled by exascale-class supercomputers. Much of this research will focus on atomic physics and its impact on such areas as catalysts used in industrial conversion, molecular dynamics simulations and quantum mechanics used to develop new materials for improved medicines, batteries, sensors and computing devices.

Exascale Day: Goodyear’s CTO Talks Exascale’s Coming Industrial Design Advantages

It’s Exascale Awareness Week, the lead-up to Exascale Day this Sunday, Oct. 18 (10^18), and while we mainly hear about the anticipated benefits of exascale-class computing for scientific discovery, there is also the economic competitiveness motive for exascale as well. In this video produced by DOE’s Exascale Computing Project (ECP), Goodyear’s Chief Technology Officer Chris […]

DOE Under Secretary for Science Dabbar’s Exascale Update: Frontier to Be First, Aurora to Be Monitored

As Exascale Day (October 18) approaches, U.S. Department of Energy Under Secretary for Science Paul Dabbar has commented on the hottest exascale question of the day: which of the country’s first three systems will be stood up first? In a recent, far-reaching interview with us, Dabbar confirmed what has been expected for more than two months, that the first U.S. exascale system will not, as planned, be the Intel-powered Aurora system at Argonne National Laboratory. It will instead be HPE-Cray’s Frontier, powered by AMD CPUs and GPUs and designated for Oak Ridge National Laboratory.

SC20 Keynote: Climate Science in the Age of Exascale with Dr. Bjorn Stevens

SC20 has announced its keynote speaker, Prof. Bjorn Stevens of the Max-Planck-Institute for Meteorology in Germany, who will speak on Monday, November 16, starting a day of plenary talks and panels at the virtual conference. Prof. Stevens will discuss how exascale computing is impacting two opportunities that are changing the face of climate science — […]

NNSA Purchasing $105M HPE Cray EX Supercomputer, to be Sited at Los Alamos

Los Alamos National Laboratory has announced a contract for a new HPE supercomputer, to be installed in spring of 2022, with quadruple the performance of the existing system for the U.S. Department of Energy’s National Nuclear Security Administration (NNSA). The $105 million HPE Cray EX supercomputer, called Crossroads, will replace Cray’s Trinity system and will be […]