Video: How OpenACC Enables Scientists to Port Their Codes to GPUs and Beyond

In this video from SC18, Jack Wells from ORNL describes how OpenACC enables scientists to port their codes to GPUs and other HPC platforms. “OpenACC, a directive-based high-level parallel programming model, has gained rapid momentum among scientific application users – the key drivers of the specification. The user-friendly programming model has facilitated acceleration of over 130 applications, including CAM, ANSYS Fluent, Gaussian, VASP, and Synopsys, on multiple platforms, and is also seen as an entry-level programming model for the top supercomputers on the TOP500 list, such as Summit, Sunway TaihuLight, and Piz Daint. As in previous years, this BoF invites scientists, programmers, and researchers to discuss their experiences in adopting OpenACC for scientific applications, learn about the roadmaps from implementers, and hear the latest developments in the specification.”
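To make “directive-based” concrete, here is a minimal, generic OpenACC sketch (a simple SAXPY loop, not drawn from any of the applications named above), assuming an OpenACC-capable compiler such as NVHPC/PGI built with the -acc flag; without OpenACC support the pragma is simply ignored and the loop runs serially:

```c
#include <stdio.h>

#define N 1000000

int main(void) {
    static float x[N], y[N];
    const float a = 2.0f;

    for (int i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    /* One directive asks the compiler to offload the loop to an
       accelerator, copying x in and copying y both ways. */
    #pragma acc parallel loop copyin(x[0:N]) copy(y[0:N])
    for (int i = 0; i < N; i++)
        y[i] = a * x[i] + y[i];

    printf("y[0] = %.1f\n", y[0]); /* expect 4.0 */
    return 0;
}
```

The same source compiles unchanged for serial CPUs, multicore, or GPUs, which is the portability argument the BoF description makes.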

Podcast Looks at Exascale Computing for Forefront Scientific Problems

In this edition of Let’s Talk Exascale, Fred Streitz of Lawrence Livermore National Laboratory describes his team’s efforts to develop supercomputer applications that address forefront scientific problems by pushing the limits of leadership-class computing. “At SC18, Fred Streitz gave a talk in the US Department of Energy booth on the topic ‘Machine Learning and Predictive Simulation: HPC and the US Cancer Moonshot on Sierra.’ As a guest on the ECP podcast, he provides an overview and some insights from his booth talk.”

Radio Free HPC Looks at TOP500 Trends on the Road to Exascale

In this podcast, the Radio Free HPC team looks at the semi-annual TOP500 BoF presentation by Jack Dongarra.

The TOP500 list of supercomputers serves as a “Who’s Who” in the field of High Performance Computing. “This BoF will present detailed analyses of the TOP500 and discuss the changes in the HPC marketplace during the past years. The BoF is meant as an open forum for discussion and feedback between the TOP500 authors and the user community.”

Video: ECP Launches Extreme-Scale Scientific Software Stack 0.1 Beta

Last week at SC18 in Dallas, the Exascale Computing Project released a portion of the next version of collaboratively developed products that compose the ECP software stack, including libraries and embedded software compilers. “Mike Heroux, ECP Software Technology director, said in an interview at SC18 that the software pieces in this release represent new capabilities and, in most instances, are highly tested and quite robust, and point toward exascale computing architectures.”

Record SC18 Attendance a Bellwether for Growth in HPC Market

The SC18 conference drew a record-breaking 13,071 attendees, making it the largest SC conference of all time. As the premier international conference showcasing high performance computing, networking, storage, and analysis, the conference and exhibition infused the local economy with more than $40 million in revenue, according to the local Dallas Convention Bureau.

The Green HPCG List and the Road to Exascale

In this special guest post, Axel Huebl looks at the TOP500 and HPCG with an eye on power-efficiency trends to watch on the road to Exascale. “This post will focus on efficiency, in terms of performance per Watt, simply because the system power envelope is a major constraint for upcoming Exascale systems. With the wealth of numbers from the TOP500, we try to extend theoretical Flop/W estimates for individual compute hardware to system scale.”
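As a back-of-the-envelope illustration of the performance-per-Watt metric discussed above, the sketch below divides a system’s HPL Rmax by its power draw. The figures are rounded, roughly Summit-class placeholders chosen for illustration, not values quoted from the lists:

```c
#include <stdio.h>

int main(void) {
    /* Rounded, illustrative figures; consult the current TOP500 and
       Green500 lists for measured values. */
    double rmax_pflops = 143.5; /* HPL Rmax in PFlop/s */
    double power_mw    = 9.8;   /* measured system power in MW */

    /* PFlop/s -> GFlop/s and MW -> W are both factors of 1e6,
       so the conversions cancel in the ratio. */
    double gflops_per_watt = (rmax_pflops * 1e6) / (power_mw * 1e6);

    printf("%.1f GFlops/W\n", gflops_per_watt); /* ~14.6 */
    return 0;
}
```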

Radio Free HPC Runs Down the TOP500 Fastest Supercomputers

In this podcast, the Radio Free HPC team looks back on the highlights of SC18 and the newest TOP500 list of the world’s fastest supercomputers.

Buddy Bland shows off Summit, the world’s fastest supercomputer, at ORNL. “The latest TOP500 list of the world’s fastest supercomputers is out, a remarkable ranking that shows five Department of Energy supercomputers in the top 10, with the first two spots captured by Summit at Oak Ridge and Sierra at Livermore. With the number one and number two systems on the planet, the ‘Rebel Alliance’ vendors of IBM, Mellanox, and NVIDIA stand far and tall above the others.”

Video: Intel Driving HPC on the Road to Exascale

In this video from SC18, Raj Hazra describes how Intel is driving the convergence of HPC and AI. “To meet the new computational challenges presented by this AI and HPC convergence, HPC is expanding beyond its traditional role of modeling and simulation to encompass visualization, analytics, and machine learning. Intel scientists and engineers will be available to discuss how to implement AI capabilities into your current HPC environments and demo how new, more powerful HPC platforms can be applied to meet your computational needs now and in the future.”

Intel Continues HPC Leadership at SC18

In this video from SC18 in Dallas, Trish Damkroger describes how Intel is pushing the limits of HPC and machine learning with a full suite of hardware, software, and cloud technologies. “Today’s high performance computers are unleashing discovery and insights at an unprecedented pace. The intersection of artificial intelligence and HPC has the potential to transform industries from life sciences to manufacturing, while solving some of the toughest challenges in our world. At SC18, HPC users got to experience how Intel’s holistic portfolio of products is transforming HPC from traditional modeling and simulation to visualization, analytics, and artificial intelligence.”

Bright Cluster Manager Prepares for Exascale at SC18

Bright’s exascale-capable version of Bright Cluster Manager is designed to support 100,000+ nodes. The company began delivering enhancements toward exascale in 2016 with features such as dedicated provisioning nodes and a new monitoring subsystem designed for extreme scale. Current development work includes dedicated monitoring nodes, hierarchical object rendering in the Bright UI, optimized API communication patterns, and exascale simulation testing.