
Video: Cray Announces First Exascale System

In this video, Cray CEO Pete Ungaro announces Aurora – Argonne National Laboratory’s forthcoming supercomputer and the United States’ first exascale system. Ungaro offers some insight on the technology, what makes exascale performance possible, and why we’re going to need it. “It is an exciting testament to Shasta’s flexible design and unique system and software capabilities, along with our Slingshot interconnect, which will be the foundation for Argonne’s extreme-scale science endeavors and data-centric workloads. Shasta is designed for this transformative exascale era and the convergence of artificial intelligence, analytics and modeling and simulation – all at the same time on the same system – at incredible scale.”

PASC19 Preview: Brueckner and Dr. Curfman McInnes to Moderate Exascale Panel Discussion

Today the PASC19 Conference announced that Dr. Lois Curfman McInnes from Argonne National Laboratory and Rich Brueckner from insideHPC will moderate a panel discussion with thought leaders focused on software challenges for Exascale and beyond, mixing “big picture” and technical discussions. “McInnes will bring her unique perspective on emerging Exascale software ecosystems to the table, while Brueckner will illustrate the benefits of Exascale to world-wide audiences.”

Video: Intel and Cray to Build First USA Exascale Supercomputer for DOE in 2021

Today Intel announced plans to deliver the first exaflop supercomputer in the United States. The Aurora supercomputer will be used to dramatically advance scientific research and discovery. The contract is valued at more than $500 million, and the system will be delivered to Argonne National Laboratory by Intel and sub-contractor Cray in 2021. “Today is an important day not only for the team of technologists and scientists who have come together to build our first exascale computer – but also for all of us who are committed to American innovation and manufacturing,” said Bob Swan, Intel CEO.

Argonne Looks to Singularity for HPC Code Portability

Over at Argonne, Nils Heinonen writes that researchers are using the open source Singularity framework as a kind of Rosetta Stone for running supercomputing code almost anywhere. “Once a containerized workflow is defined, its image can be snapshotted, archived, and preserved for future use. The snapshot itself represents a boon for scientific provenance by detailing the exact conditions under which given data were generated: in theory, by providing the machine, the software stack, and the parameters, one’s work can be completely reproduced.”
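To make the containerization idea concrete, here is a minimal, hypothetical Singularity definition file (illustrative only – the file name, packages, and script path are assumptions, not taken from the Argonne work). It pins the base OS and software stack so the built image becomes the archivable snapshot the article describes:

```
# analysis.def – hypothetical Singularity definition file (illustrative sketch)
Bootstrap: docker
From: ubuntu:18.04

%post
    # Install the software stack inside the image; pinning exact package
    # versions here is what makes the snapshot reproducible later.
    apt-get update && apt-get install -y python3 python3-numpy

%runscript
    # Entry point executed by "singularity run"; /opt/analysis.py is a
    # placeholder for the user's actual workflow script.
    exec python3 /opt/analysis.py "$@"
```

Building with `singularity build analysis.sif analysis.def` produces a single image file that can be archived alongside the data it generated, and `singularity run analysis.sif` re-executes the workflow under the same recorded conditions.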

Machine Learning Award Powers Engine Design at Argonne

Over at Argonne, Jared Sagoff writes that automotive manufacturers are leveraging the power of DOE supercomputers to simulate the combustion engines of the future. “As part of a partnership between Argonne, Convergent Science, and Parallel Works, engine modelers are beginning to use machine learning algorithms and artificial intelligence to optimize their simulations. This alliance recently received a Technology Commercialization Fund award from the DOE to complete this important project.”

Video: Supercomputing the Secrets of Giant Stars

In this video, supercomputing power and algorithms help astrophysicists untangle giant stars’ brightness, temperature, and chemical variations. “As a star becomes redder (and cooler), it becomes more variable. That’s a pretty firm prediction from what we’ve found, and that’s going to be what’s exciting to test in detail.”

Data Science Program at Argonne Looks to Machine Learning for New Breakthroughs

Over at Argonne, Nils Heinonen writes about four new projects for the ALCF Data Science Program that will utilize machine learning, deep learning, and other artificial intelligence methods to enable data-driven discoveries across scientific disciplines. “Each project intends to implement novel machine learning techniques; some will integrate these methods with simulations and experiments, while others will pioneer uncertainty quantification and visualization to aid in the interpretation of deep neural networks.”
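Uncertainty quantification for learned models can take many forms; one common technique (not necessarily what these ALCF projects will use) is ensemble disagreement: train several models and treat the spread of their predictions as an uncertainty estimate. A minimal, self-contained sketch with toy stand-in “models”:

```python
import random
import statistics

# Illustrative sketch only: each "model" is a noisy linear fit standing in
# for an independently trained neural network.
def make_model(seed):
    rng = random.Random(seed)
    slope = 2.0 + rng.gauss(0, 0.1)  # members agree closely near the data
    bias = rng.gauss(0, 0.1)
    return lambda x: slope * x + bias

ensemble = [make_model(s) for s in range(10)]

def predict_with_uncertainty(x):
    """Ensemble mean is the prediction; ensemble spread is the uncertainty."""
    preds = [m(x) for m in ensemble]
    return statistics.mean(preds), statistics.stdev(preds)

mean_near, std_near = predict_with_uncertainty(1.0)
mean_far, std_far = predict_with_uncertainty(100.0)
```

Far from the training regime the members disagree more, so `std_far` exceeds `std_near` – exactly the behavior that makes such estimates useful for flagging unreliable predictions.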

ALCF – The March toward Exascale

David E. Martin gave this talk at the HPC User Forum. “In 2021, the Argonne Leadership Computing Facility (ALCF) will deploy Aurora, a new Intel-Cray system. Aurora, will be capable of over 1 exaflops. It is expected to have over 50,000 nodes and over 5 petabytes of total memory, including high bandwidth memory. The Aurora architecture will enable scientific discoveries using simulation, data and learning.”

Argonne is Supercomputing Big Data from the Large Hadron Collider

Over at Argonne, Madeleine O’Keefe writes that the Lab is supporting CERN researchers working to interpret Big Data from the Large Hadron Collider (LHC), the world’s largest particle accelerator. The LHC is expected to output 50 petabytes of data this year alone, the equivalent of nearly 15 million high-definition movies—an amount so enormous that analyzing it all poses a serious challenge to researchers.
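The movie comparison checks out with a quick back-of-the-envelope calculation (assuming decimal petabytes and that an HD movie runs to a few gigabytes):

```python
# Sanity check of the article's comparison.
lhc_output_bytes = 50 * 10**15   # 50 petabytes expected this year (decimal)
movies = 15_000_000              # "nearly 15 million" HD movies
gb_per_movie = lhc_output_bytes / movies / 10**9
print(f"{gb_per_movie:.1f} GB per movie")  # prints "3.3 GB per movie"
```

Roughly 3.3 GB per movie is a typical size for a high-definition film, so the two figures are consistent.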

Video: Rick Stevens from Argonne on the CANDLE Project for Exascale

In this video, Mike Bernhardt from ECP discusses the CANDLE project for Exascale with Rick Stevens from Argonne. “CANDLE is endeavoring to build the software environment for solving very large-scale distributed learning problems on the DOE Leadership Computing platforms.”