Exascale Computing Project Update

Doug Kothe from the Exascale Computing Project gave this talk at the HPC User Forum. “The Exascale Computing Project (ECP) is focused on accelerating the delivery of a capable exascale computing ecosystem that delivers 50 times more computational science and data analytic application power than possible with DOE HPC systems such as Titan (ORNL) and Sequoia (LLNL). With the goal to launch a US exascale ecosystem by 2021, the ECP will have profound effects on the American people and the world.”

Podcast: ECP Team Achieves Huge Performance Gain on Materials Simulation Code

The Exascale Atomistics for Accuracy, Length, and Time (EXAALT) project within the US Department of Energy’s Exascale Computing Project (ECP) has taken a major step forward, delivering a five-fold performance gain on its fusion energy materials simulation challenge problem. “Summit is at roughly 200 petaflops, so by the time we go to the exascale, we should have another factor of five. That starts to be a transformative kind of change in our ability to do the science on these machines.”

Interview: Knowledgebase is power for nuclear reactor developers

AI technologies are being used to help develop next-generation nuclear energy systems that could help reduce our dependence on fossil fuels. In this special guest feature, Dawn Levy and Weiju Ren from ORNL explore the challenges and opportunities in sharing nuclear materials knowledge internationally. “A knowledgebase is more than a database. Data are just symbols representing observations or the products of observations. Knowledge is not only data, but also people’s understanding of the data.”

Supercomputing Galactic Winds with Cholla

Using the Titan supercomputer at Oak Ridge National Laboratory, a team of astrophysicists created the highest-resolution galactic wind simulations ever performed. The simulations will allow researchers to gather and interpret more accurate, detailed data on how galactic winds affect the formation and evolution of galaxies.

Podcast: ExaStar Project Seeks Answers in Cosmos

In this podcast, Daniel Kasen from LBNL and Bronson Messer of ORNL discuss advancing cosmology through ExaStar, part of the Exascale Computing Project. “We want to figure out how space and time get warped by gravitational waves, how neutrinos and other subatomic particles were produced in these explosions, and how they sort of lead us down to a chain of events that finally produced us.”

Cray Shasta Supercomputer to power weather forecasting for U.S. Air Force

Today Cray announced that its first Shasta supercomputing system for operational weather forecasting and meteorology will be acquired by the Air Force Life Cycle Management Center in partnership with Oak Ridge National Laboratory. The high-performance computing capabilities of the new system, named HPC11, will enable higher-fidelity weather forecasts for U.S. Air Force and Army operations worldwide. The contract is valued at $25 million.

Registration Opens for September HPC User Forum at Argonne

Registration is now open for the HPC User Forum at Argonne National Lab. “Our global steering committee representing leading HPC centers has worked with Hyperion Research to provide a powerful agenda representing key trends at the forefront of government, academic and private sector HPC use around the world. You’ll hear about recent developments in the exascale race, architectures, HPDA-AI, smart cities, cloud computing, industrial-commercial HPC and other important topics.”

Podcast: Quantum Applications are Always Hybrid

In this podcast, the Radio Free HPC team looks at the inherently hybrid nature of quantum computing applications. “If you’re always going to have to mix classical code with quantum code then you need an environment that is built for that workflow, and thus we see a lot of attention given to that in the QIS (Quantum Information Science) area. This is reminiscent of OpenGL for graphics accelerators and OpenCL/CUDA for compute accelerators.”
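To make the “always hybrid” point concrete, here is a minimal sketch of the workflow pattern the podcast describes: a classical optimizer repeatedly calls a quantum kernel and steers it using the results. The function names are illustrative, and the quantum step is stood in by its exact analytic result (⟨Z⟩ = cos θ for a single-qubit RY rotation) so the sketch runs without any quantum SDK; on real hardware that call would submit a parameterized circuit to a device or simulator.

```python
import numpy as np

def quantum_expectation(theta):
    # Stand-in for the quantum step. On real hardware this would submit a
    # parameterized circuit (e.g., an RY(theta) rotation) and return the
    # measured expectation value of Z; here we use the analytic result,
    # <Z> = cos(theta), so the example runs without a quantum SDK.
    return np.cos(theta)

def classical_optimize(objective, theta, lr=0.2, steps=50, eps=1e-3):
    # The classical half of the loop: finite-difference gradient descent
    # that repeatedly calls the quantum kernel to evaluate the objective.
    for _ in range(steps):
        grad = (objective(theta + eps) - objective(theta - eps)) / (2 * eps)
        theta -= lr * grad
    return theta

if __name__ == "__main__":
    # Minimize <Z>; the optimum is theta = pi, where <Z> = -1.
    best = classical_optimize(quantum_expectation, theta=0.5)
    print(f"theta = {best:.3f}, <Z> = {quantum_expectation(best):.3f}")
```

The structural point is that the quantum device only ever evaluates circuits; the control flow, optimization, and decision-making remain classical code, which is why the programming environment has to accommodate both.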

Podcast: How Exascale Computing Could Help Boost Energy Production

In this podcast, Tom Evans, technical lead for ECP’s Energy Applications projects, discusses the motivations, progress, and aspirations on the path to exascale. Evans describes the unprecedented calculations expected at the exascale, the example of taking wind energy simulations much further, and the movement toward more general-purpose programming tools.

Argonne Team Breaks Record with 2.9-Petabyte Globus Data Transfer

Today the Globus research data management service announced the largest single file transfer in its history: a team led by Argonne National Laboratory scientists moved 2.9 petabytes of data as part of a research project involving three of the largest cosmological simulations to date. “With exascale imminent, AI on the rise, HPC systems proliferating, and research teams more distributed than ever, fast, secure, reliable data movement and management are now more important than ever,” said Ian Foster.
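For readers who script their own transfers, here is a minimal sketch of submitting a managed transfer with the Globus Python SDK (globus_sdk). The access token, endpoint UUIDs, and paths are placeholders, and this is not the Argonne team’s pipeline; it simply shows the basic API pattern behind the kind of transfer described above.

```python
import globus_sdk

# Placeholders: a transfer-scoped access token and the UUIDs of two Globus
# endpoints/collections you can access (not the Argonne team's endpoints).
ACCESS_TOKEN = "TRANSFER_ACCESS_TOKEN"
SRC_ENDPOINT = "source-endpoint-uuid"
DST_ENDPOINT = "destination-endpoint-uuid"

tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(ACCESS_TOKEN)
)

# Describe the transfer: a recursive directory copy with checksum verification.
tdata = globus_sdk.TransferData(
    tc,
    SRC_ENDPOINT,
    DST_ENDPOINT,
    label="simulation output transfer",
    sync_level="checksum",
)
tdata.add_item("/project/simulation/output/", "/archive/simulation/output/", recursive=True)

# Submit; the Globus service then manages retries and integrity checks
# while the transfer runs.
task = tc.submit_transfer(tdata)
print("Submitted transfer, task_id =", task["task_id"])
```

The point of the service model, echoed in Foster’s quote, is that the user hands off the transfer and Globus handles fault recovery and verification at whatever scale the endpoints can sustain.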