Argonne Looks to Singularity for HPC Code Portability

Over at Argonne, Nils Heinonen writes that researchers are using the open source Singularity framework as a kind of Rosetta Stone for running supercomputing code almost anywhere. “Once a containerized workflow is defined, its image can be snapshotted, archived, and preserved for future use. The snapshot itself represents a boon for scientific provenance by detailing the exact conditions under which given data were generated: in theory, by providing the machine, the software stack, and the parameters, one’s work can be completely reproduced.”

Data Science Program at Argonne Looks to Machine Learning for New Breakthroughs

Over at Argonne, Nils Heinonen writes about four new projects selected for the ALCF Data Science Program that will utilize machine learning, deep learning, and other artificial intelligence methods to enable data-driven discoveries across scientific disciplines. “Each project intends to implement novel machine learning techniques; some will integrate these methods with simulations and experiments, while others will pioneer uncertainty quantification and visualization to aid in the interpretation of deep neural networks.”

ALCF – The March toward Exascale

David E. Martin gave this talk at the HPC User Forum. “In 2021, the Argonne Leadership Computing Facility (ALCF) will deploy Aurora, a new Intel-Cray system. Aurora will be capable of over 1 exaflops. It is expected to have over 50,000 nodes and over 5 petabytes of total memory, including high bandwidth memory. The Aurora architecture will enable scientific discoveries using simulation, data and learning.”

Argonne is Supercomputing Big Data from the Large Hadron Collider

Over at Argonne, Madeleine O’Keefe writes that the Lab is supporting CERN researchers working to interpret Big Data from the Large Hadron Collider (LHC), the world’s largest particle accelerator. The LHC is expected to output 50 petabytes of data this year alone, the equivalent of nearly 15 million high-definition movies—an amount so enormous that analyzing it all poses a serious challenge to researchers.

Evolving Scientific Computing at Argonne

Over at Argonne, John Spizzirri writes that the Lab has helped advance the boundaries of high-performance computing technologies through the Argonne Leadership Computing Facility (ALCF). “Realizing the promise of exascale computing, the ALCF is developing the framework by which to harness this immense computing power to an advanced combination of simulation, data analysis, and machine learning. This effort will undoubtedly reframe the way science is conducted, and do so on a global scale.”

DOE Awards 1.5 Billion Hours of Computing Time at Argonne

The ASCR Leadership Computing Challenge has awarded 20 projects a total of 1.5 billion core-hours at Argonne to pursue challenging, high-risk, high-payoff simulations. “The Advanced Scientific Computing Research (ASCR) program, which manages some of the world’s most powerful supercomputing facilities, selects projects every year in areas directly related to the DOE mission for broadening the community of researchers capable of using leadership computing resources, and serving national interests for the advancement of scientific discovery, technological innovation, and economic competitiveness.”

Argonne Helps to Develop all-new Lithium-air Batteries

Scientists at Argonne are helping to develop better batteries for our electronic devices. The goal is to develop beyond-lithium-ion batteries that are even more powerful, cheaper, safer and longer lived. “The energy storage capacity was about three times that of a lithium-ion battery, and five times should be easily possible with continued research. This first demonstration of a true lithium-air battery is an important step toward what we call beyond-lithium-ion batteries.”

Future HPC Leaders Gather at Argonne Training Program on Extreme-Scale Computing

Over at ALCF, Andrea Manning writes that the recent Argonne Training Program on Extreme-Scale Computing brought together HPC practitioners from around the world. “You can’t get this material out of a textbook,” said Eric Nielsen, a research scientist at NASA’s Langley Research Center. Added Johann Dahm of IBM Research, “I haven’t had this material presented to me in this sort of way ever.”

Supercomputing Jet Noise for a Quieter World

Researchers at the University of Minnesota are using Argonne supercomputers to look for new ways to reduce the noise produced by jet engines. Among the loudest sources of human-made noise that exist, jet engines can produce sound in excess of 130 decibels. “The University of Minnesota team developed a new method based on input-output analysis that can predict both the downstream noise and the sideline noise. While it was thought that the sideline noise was random, the input-output modes show coherent structure in the jet that is connected to the sideline noise, such that it can be predicted and controlled.”

Illinois Supercomputers Tag Team for Big Bang Simulation

Researchers are tapping Argonne and NCSA supercomputers to tackle the unprecedented amounts of data involved with simulating the Big Bang. “Researchers performed cosmological simulations on the ALCF’s Mira supercomputer, and then sent huge quantities of data to UI’s Blue Waters, which is better suited to perform the required data analysis tasks because of its processing power and memory balance.”