House Hearing: Supercomputing and American Technology Leadership

David Turek, VP, Technical Computing, IBM

“Today, we will hear from a distinguished panel of witnesses about the importance of high performance computing to American technological competitiveness, specifically focusing on the Department of Energy’s Advanced Scientific Computing Research program, also known as the ‘ASCR’ program within the Office of Science.”

Funding the March Towards Exascale

Robert Roe

In this special guest feature from Scientific Computing World, Robert Roe writes that recent DOE funding for high-end supercomputing bodes well for the continuing march to Exascale levels of computation.

Video: Bill Harrod Accepts HPC Vanguard Award at SC14


In this video, Bill Harrod from the Department of Energy accepts the HPC Vanguard Award from Rich Brueckner and Thomas Sterling at SC14. “Launched by The Exascale Report in 2013, the HPC Vanguard Award recognizes critical leaders in the HPC community’s strategic push to achieve exascale levels of supercomputing performance.”

Radio Free HPC Takes a Hard Look at the Two 2017 CORAL Supercomputers


In this video, the Radio Free HPC team meets at SC14 in New Orleans to discuss the recent news that Nvidia and IBM will build two CORAL 150+ petaflop supercomputers in 2017 for Lawrence Livermore and Oak Ridge National Laboratories. The two machines will feature IBM POWER9 processors coupled with Nvidia’s future Volta GPU technology. NVLink will be a critical piece of the architecture as well, along with a system interconnect powered by Mellanox.

This Week in HPC: Cray Creates GPU Heavy Server Node and New Exascale Recommendations for the DOE

In this episode of This Week in HPC, Michael Feldman and Addison Snell from Intersect360 Research discuss the new Cray CS-Storm supercomputer based on Nvidia GPUs. After that, the discussion turns to exascale investment recommendations coming out of a new report from a Department of Energy Task Force.

Supercomputing 102: The Toolbox of a Successful Computational Scientist

Judith Hill

“Successful computational scientists are experts in a scientific field, such as chemistry, physics, or astrophysics; are knowledgeable about both mathematical representations and algorithmic implementations; and also specialize in developing and optimizing scientific application codes to run on computers, both large and small. A truly successful computational science investigation requires the ‘three A’s’: a compelling Application, the appropriate Algorithm, and the underlying Architecture.”

Burst Buffers and Data-Intensive Scientific Computing

Glenn Lockwood

“For those who haven’t been following the details of one of DOE’s more recent procurement rounds, the NERSC-8 and Trinity request for proposals (RFP) explicitly required that all vendor proposals include a burst buffer to address the capability of multi-petaflop simulations to dump tremendous amounts of data in very short order. The target use case is for petascale checkpoint-restart, where the memory of thousands of nodes (hundreds of terabytes of data) needs to be flushed to disk in an amount of time that doesn’t dominate the overall execution time of the calculation.”
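The checkpoint-restart case Lockwood describes is at heart a bandwidth arithmetic problem: the time to dump memory scales with aggregate system memory divided by write bandwidth, which is exactly where a faster burst buffer tier pays off. A minimal sketch makes this concrete; note that all node counts, memory sizes, and bandwidths below are illustrative assumptions, not figures from the NERSC-8/Trinity RFP.

```python
# Back-of-envelope checkpoint-restart timing sketch.
# All system figures here are hypothetical, chosen only to show the scaling.

def checkpoint_time(total_memory_tb: float, bandwidth_gbs: float) -> float:
    """Seconds to flush total_memory_tb terabytes at bandwidth_gbs GB/s."""
    return total_memory_tb * 1000.0 / bandwidth_gbs

# Hypothetical machine: 5,000 nodes x 64 GB of memory each = 320 TB total.
memory_tb = 5000 * 64 / 1000.0  # 320 TB

# Flushing straight to a parallel file system at an assumed 500 GB/s...
pfs_seconds = checkpoint_time(memory_tb, 500.0)   # 640 s, roughly 11 minutes

# ...versus flushing to an assumed 5,000 GB/s burst buffer tier.
bb_seconds = checkpoint_time(memory_tb, 5000.0)   # 64 s

print(f"Parallel file system: {pfs_seconds:.0f} s")
print(f"Burst buffer:         {bb_seconds:.0f} s")
```

Whether a checkpoint interval is tolerable then comes down to how those seconds compare with the compute time between checkpoints, which is why the RFP frames the requirement as flushing “in an amount of time that doesn’t dominate the overall execution time of the calculation.”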