Reconstructing Nuclear Physics Experiments with Supercomputers

For the first time, scientists have used HPC to reconstruct the data collected by a nuclear physics experiment, an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries. “By running multiple computing jobs simultaneously on the allotted supercomputing cores, the team transformed 4.73 petabytes of raw data into 2.45 petabytes of ‘physics-ready’ data in a fraction of the time it would have taken using in-house high-throughput computing resources, even with a two-way transcontinental data journey.”

Hayward Fault Earthquake Simulations Increase Fidelity of Ground Motions

Researchers at LLNL are using supercomputers to simulate the onset of earthquakes in California. “This study shows that powerful supercomputing can be used to calculate earthquake shaking on a large, regional scale with more realism than we’ve ever been able to produce before,” said Artie Rodgers, LLNL seismologist and lead author of the paper.

Video: Deep Learning for Science

Prabhat from NERSC and Michael F. Wehner from LBNL gave this talk at the Intel HPC Developer Conference in Denver. “Deep Learning has revolutionized the fields of computer vision, speech recognition and control systems. Can Deep Learning (DL) work for scientific problems? This talk will explore a variety of Lawrence Berkeley National Laboratory’s applications that are currently benefiting from DL.”

Speeding Data Transfer with ESnet’s Petascale DTN Project

Researchers at DOE facilities are looking to dramatically increase their data transfer capabilities through the Petascale DTN project. “The collaboration, named the Petascale DTN project, also includes the National Center for Supercomputing Applications (NCSA) at the University of Illinois in Urbana-Champaign, a leading center funded by the National Science Foundation (NSF). Together, the collaboration aims to achieve regular disk-to-disk, end-to-end transfer rates of one petabyte per week between major facilities, which translates to achievable throughput rates of about 15 Gbps on real-world science data sets.”
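As a quick back-of-the-envelope check (ours, not from the article), the one-petabyte-per-week target can be converted to a line rate directly; the short Python sketch below assumes 1 PB = 10^15 bytes:

# Petascale DTN sanity check: 1 PB per week expressed as a sustained line rate.
SECONDS_PER_WEEK = 7 * 24 * 3600     # 604,800 seconds
BITS_PER_PETABYTE = 1e15 * 8         # assumes 1 PB = 10^15 bytes

sustained_gbps = BITS_PER_PETABYTE / SECONDS_PER_WEEK / 1e9
print(f"1 PB/week ≈ {sustained_gbps:.1f} Gbps sustained")   # ≈ 13.2 Gbps

A continuous transfer would therefore need roughly 13 Gbps, so the quoted figure of about 15 Gbps is consistent once protocol overhead and idle time between transfers are taken into account.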

Supercomputing Earthquakes in the Age of Exascale

Tomorrow’s exascale supercomputers will enable researchers to accurately simulate the ground motions of regional earthquakes quickly and in unprecedented detail. “Simulations of high frequency earthquakes are more computationally demanding and will require exascale computers,” said David McCallen, who leads the ECP-supported effort. “Ultimately, we’d like to get to a much larger domain, higher frequency resolution and speed up our simulation time.”

NERSC Lends a Hand to 2017 Tapia Conference on Diversity in Computing

The recent Tapia Conference on Diversity in Computing in Atlanta brought together some 1,200 undergraduate and graduate students, faculty, researchers and professionals in computing from diverse backgrounds and ethnicities to learn from leading thinkers, present innovative ideas and network with peers.

Sowing Seeds of Quantum Computation at Berkeley Lab

“Berkeley Lab’s tradition of team science, as well as its proximity to UC Berkeley and Silicon Valley, makes it an ideal place to work on quantum computing end-to-end,” says Jonathan Carter, Deputy Director of Berkeley Lab Computing Sciences. “We have physicists and chemists at the lab who are studying the fundamental science of quantum mechanics, engineers to design and fabricate quantum processors, as well as computer scientists and mathematicians to ensure that the hardware will be able to effectively compute DOE science.”

IDEAS Program Fostering Better Software Development for Exascale

Scalability of scientific applications is a major focus of the Department of Energy’s Exascale Computing Project (ECP). In that vein, a project known as IDEAS-ECP (Interoperable Design of Extreme-scale Application Software) is also being scaled up to deliver insight on software development to the research community.

Kathy Yelick Presents: Breakthrough Science at the Exascale

UC Berkeley professor Kathy Yelick presented this talk at the 2017 ACM Europe Conference. “Yelick’s keynote lecture focused on the exciting opportunities that High Performance Computing presents, the need for algorithms and mathematics to advance along with system performance, and how the variety of workloads will stress the different aspects of exascale hardware and software systems.”

Podcast: Mapping DNA at Near-Atomic Resolution with Cryo-EM

In this podcast, Berkeley Lab’s Eva Nogales describes how her team is using a new imaging technology that is yielding remarkably detailed 3-D models of complex biomolecules critical to DNA function. Using cryo-electron microscopy (cryo-EM), Nogales and her colleagues have resolved, at near-atomic resolution, the structure of a human transcription factor used in gene expression and DNA repair.