NCSA Blue Waters Report Shows Economic Benefits of HPC

The importance of supercomputing to local and national economic prosperity has been highlighted by a recent study, which reported that the Blue Waters project is worth more than $1.08 billion to the Illinois economy. The study was published by the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

Video: Overview of Scientific Workflows

Scott Callaghan from the Southern California Earthquake Center presented this talk as part of the Blue Waters Webinar Series. “I will present an overview of scientific workflows. I’ll discuss what the community means by “workflows” and what elements make up a workflow. We’ll talk about common problems that users might be facing, such as automation, job management, data staging, resource provisioning, and provenance tracking, and explain how workflow tools can help address these challenges. I’ll present a brief example from my own work with a series of seismic codes showing how using workflow tools can improve scientific applications.”
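The workflow elements listed above (automation, job management, data staging, provenance tracking) can be sketched as a minimal task-dependency runner. This is an illustrative sketch only; the task names and pipeline are invented for the example and are not taken from SCEC's actual seismic codes or any particular workflow tool.

```python
# Minimal sketch of a scientific-workflow runner: tasks with declared
# dependencies run in topological order, and a provenance record is
# kept of what ran and what it produced. Task names are illustrative.

from graphlib import TopologicalSorter

def run_workflow(tasks, deps):
    """tasks: name -> callable; deps: name -> set of prerequisite names."""
    provenance = []
    for name in TopologicalSorter(deps).static_order():
        result = tasks[name]()             # automation: no manual job hand-off
        provenance.append((name, result))  # provenance: record task and output
    return provenance

# Toy pipeline: stage input data, run a simulation, post-process results.
tasks = {
    "stage_data":  lambda: "input.dat",
    "simulate":    lambda: "raw_output.dat",
    "postprocess": lambda: "results.csv",
}
deps = {
    "stage_data":  set(),
    "simulate":    {"stage_data"},
    "postprocess": {"simulate"},
}

print([name for name, _ in run_workflow(tasks, deps)])
# ['stage_data', 'simulate', 'postprocess']
```

Real workflow tools add what this sketch omits: submitting each task as a batch job, restarting from failures, and moving data between file systems.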

Video: An Overview of the Blue Waters Supercomputer at NCSA

In this video, Robert Brunner from NCSA presents: Blue Waters System Overview. “Blue Waters is one of the most powerful supercomputers in the world. Scientists and engineers across the country use the computing and data power of Blue Waters to tackle a wide range of challenging problems, from predicting the behavior of complex biological systems to simulating the evolution of the cosmos.”

Exxon Mobil and NCSA Achieve New Levels of Scalability on Complex Oil & Gas Reservoir Simulation Models

“This breakthrough has unlocked new potential for ExxonMobil’s geoscientists and engineers to make more informed and timely decisions on the development and management of oil and gas reservoirs,” said Tom Schuessler, president of ExxonMobil Upstream Research Company. “As our industry looks for cost-effective and environmentally responsible ways to find and develop oil and gas fields, we rely on this type of technology to model the complex processes that govern the flow of oil, water and gas in various reservoirs.”

Supercomputing 3D Elevation Maps of Alaska on Blue Waters

Today, the National Geospatial-Intelligence Agency and NSF released 3-D topographic maps that show Alaska’s terrain in greater detail than ever before. Powered by the Blue Waters supercomputer, the maps are the result of a White House Arctic initiative to inform better decision-making in the Arctic. “We can’t live without Blue Waters now,” said Paul Morin, head of the University of Minnesota’s Polar Geospatial Center. “The supercomputer itself, the tools the Blue Waters team at NCSA developed, the techniques they’ve come up with in using this hardware. Blue Waters is changing the way digital terrain is made and that is changing how science is done in the Arctic.”

Simulating the Earliest Generations of Galaxies with Enzo and Blue Waters

“Galaxies are complex—many physical processes operate simultaneously, and over a huge range of scales in space and time. As a result, accurately modeling the formation and evolution of galaxies over the lifetime of the universe presents tremendous technical challenges. In this talk I will describe some of the important unanswered questions regarding galaxy formation, discuss in general terms how we simulate the formation of galaxies on a computer, and present simulations (and accompanying published results) that the Enzo collaboration has recently done on the Blue Waters supercomputer. In particular, I will focus on the transition from metal-free to metal-enriched star formation in the universe, as well as the luminosity function of the earliest generations of galaxies and how we might observe it with the upcoming James Webb Space Telescope.”

Extreme-scale Graph Analysis on Blue Waters

George Slota presented this talk at the Blue Waters Symposium. “In recent years, many graph processing frameworks have been introduced with the goal to simplify analysis of real-world graphs on commodity hardware. However, these popular frameworks lack scalability to modern massive-scale datasets. This work introduces a methodology for graph processing on distributed HPC systems that is simple to implement, generalizable to broad classes of graph algorithms, and scales to systems with hundreds of thousands of cores and graphs of billions of vertices and trillions of edges.”
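A common building block behind scalable graph frameworks of the kind described above is level-synchronous breadth-first search, where the whole frontier is expanded in bulk each step. The sketch below is a hedged single-process illustration, not Slota's actual methodology: on a distributed HPC system each rank would own a vertex partition and exchange frontier vertices via MPI, and the tiny graph here is invented for the example.

```python
# Hedged sketch: level-synchronous BFS. Each iteration expands the
# entire current frontier at once; in a distributed setting this is
# the point where ranks would exchange newly discovered vertices.

def bfs_levels(adj, source):
    """Return a vertex -> BFS level mapping, one frontier per step."""
    level = {source: 0}
    frontier = {source}
    depth = 0
    while frontier:
        depth += 1
        # Bulk-expand the frontier (the "level-synchronous" step).
        next_frontier = {v for u in frontier for v in adj[u] if v not in level}
        for v in next_frontier:
            level[v] = depth
        frontier = next_frontier
    return level

# Illustrative 5-vertex graph (adjacency lists).
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(sorted(bfs_levels(adj, 0).items()))
# [(0, 0), (1, 1), (2, 1), (3, 2), (4, 3)]
```

The bulk-synchronous structure is what lets such algorithms scale: communication happens once per level rather than once per vertex.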

Speakers Announced for HPC User Forum in Beijing

IDC has announced the featured speakers for the next international HPC User Forum. The event will take place Sept. 22 in Beijing, China.

Tutorial: GPU Performance Nuggets

In this video from the 2016 Blue Waters Symposium, Carl Pearson and Simon Garcia De Gonzalo from the University of Illinois present: GPU Performance Nuggets. “In this talk, we introduce a pair of Nvidia performance tools available on Blue Waters. We discuss what the GPU memory hierarchy provides for your application. We then present a case study that explores if memory hierarchy optimization can go too far.”
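The memory-hierarchy optimization the talk refers to is typically tiling (blocking): restructuring a computation so each working tile fits in a fast memory tier and is reused while resident there. The plain-Python sketch below only illustrates the access pattern under that assumption; on a GPU the fast tier would be shared memory and the code would be a CUDA kernel, not Python.

```python
# Hedged illustration of tiling: a blocked matrix multiply in which
# each (tile x tile) block is reused many times while "resident" in
# the fast memory tier. Plain Python gains nothing from this; the
# point is the loop structure, which mirrors GPU shared-memory tiling.

def matmul_tiled(A, B, n, tile=2):
    """Multiply two n x n matrices (lists of lists) block by block."""
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, tile):
        for jj in range(0, n, tile):
            for kk in range(0, n, tile):
                # Inner loops touch only one block of A, B, and C,
                # so each block is loaded once and reused tile times.
                for i in range(ii, min(ii + tile, n)):
                    for j in range(jj, min(jj + tile, n)):
                        s = C[i][j]
                        for k in range(kk, min(kk + tile, n)):
                            s += A[i][k] * B[k][j]
                        C[i][j] = s
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul_tiled(A, B, 2))  # [[19.0, 22.0], [43.0, 50.0]]
```

The talk's "can optimization go too far" question arises because aggressive tiling can raise register and shared-memory pressure, reducing occupancy even as data reuse improves.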

Video: What is Driving Heterogeneity in HPC?

Wen-mei Hwu from the University of Illinois at Urbana-Champaign presented this talk at the Blue Waters Symposium. “In the 21st Century, we are able to understand, design, and create what we can compute. Computational models are allowing us to see even farther, going back and forth in time, learn better, test hypotheses that cannot be verified any other way, and create safe artificial processes.”