Podcast: Co-Design for Online Data Analysis and Reduction at Exascale


Ian Foster of Argonne National Laboratory

In this Let’s Talk Exascale podcast, Ian Foster of Argonne National Laboratory describes how ECP’s CODAR project is addressing the need for data reduction, analysis, and management in the exascale era.

ECP has five co-design centers, one of which is CODAR: Co-Design for Online Data Analysis and Reduction at the Exascale.

Foster has been a scientist at Argonne National Laboratory for thirty years and is director of the lab’s Data Science and Learning Division. He was excited to work on the eight-processor parallel computers of the early 1990s and experience the progression from terascale to petascale capabilities. Now he’s ready to help bring in the next big technological advance—exascale computing.

“I’m interested in the methods required to allow those systems to be used for challenging scientific problems,” Foster said.

Although exascale supercomputers will be much more powerful than today’s systems and will open the door to solving previously intractable science problems, the traditional approach of writing out all the data a simulation produces and analyzing it afterward won’t be possible.

“There just isn’t enough I/O capacity on those systems,” Foster said. “So we need to perform analysis online on the supercomputer as the simulation is being performed.”

Analyzing data in situ, or on the compute nodes, requires investigating new methods of data analysis and reduction and developing mechanisms to couple them with simulations. The CODAR team is addressing that need and integrating the new methods into ECP science applications.
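To make the pattern concrete, here is a minimal Python/NumPy sketch of in situ reduction; the physics step, the block-averaging reducer, and the file-per-step output are illustrative placeholders, not CODAR code:

```python
import numpy as np

def simulation_step(field, t):
    """Placeholder physics update; a real application advances a PDE here."""
    return field + 0.01 * np.sin(t + field)

def reduce_online(field, block=8):
    """Toy in situ reduction: replace each block-by-block tile with its mean,
    shrinking the output volume by block**2 for a 2-D field."""
    nx, ny = field.shape
    trimmed = field[:nx - nx % block, :ny - ny % block]
    return trimmed.reshape(nx // block, block,
                           ny // block, block).mean(axis=(1, 3))

field = np.random.rand(1024, 1024)
for step in range(100):
    field = simulation_step(field, step)
    # The traditional post hoc approach would write the full 8 MiB field
    # every step; instead we write only a 128x128 summary, a 64x reduction.
    np.save(f"summary_{step:04d}.npy", reduce_online(field))
```

In a real coupling, the reduction would run alongside the simulation on the compute nodes rather than inside its loop, but the effect on I/O volume is the same.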

The team works closely with groups across ECP, including application teams facing data-analysis and reduction problems, groups developing system software components, and people developing compression methods. They also collaborate with vendors to understand how to implement these methods efficiently on their machines, and feedback flows in both directions as CODAR identifies new requirements in areas such as system software.

CODAR also collaborates closely with ECP’s WDMApp project, which aims to use exascale computing to provide a whole-device modeling capability for magnetically confined fusion plasmas.

“That’s a fascinating application project scientifically, certainly, but technically, they need to couple together multiple simulation components and then they have to do reduction and analysis online,” Foster said. “We’ve developed mechanisms that allow them to run all components online and reduce their data by two orders of magnitude, at this point.”
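Reductions on that scale typically come from error-bounded lossy compression; CODAR works with compressors such as SZ and ZFP. The toy Python sketch below illustrates the principle only, and is not any project’s actual pipeline: values are quantized within a user-chosen error bound, then delta- and entropy-coded.

```python
import zlib
import numpy as np

ABS_ERR = 1e-2  # user-chosen absolute error bound (illustrative value)

def quantize(x, e):
    """Map each value to an integer bin of width 2*e, guaranteeing that
    dequantized values differ from the originals by at most e."""
    return np.round(x / (2 * e)).astype(np.int64)

def dequantize(codes, e):
    return codes * (2 * e)

# A smooth 1-D field standing in for simulation output.
field = np.sin(np.linspace(0.0, 10.0, 1_000_000))

codes = quantize(field, ABS_ERR)
assert np.abs(dequantize(codes, ABS_ERR) - field).max() <= ABS_ERR

# Crude predictor (delta coding) plus a generic entropy coder; real
# compressors use far better prediction and coding stages, and real
# simulation data is far less smooth than this toy field.
deltas = np.diff(codes, prepend=codes[:1]).astype(np.int8)
packed = zlib.compress(deltas.tobytes(), level=9)
print(f"compression ratio: {field.nbytes / len(packed):.0f}x")
```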

When compressing data produced by a simulation, the idea is to keep the parts that are scientifically interesting and discard those that are not. However, every application, and perhaps every scientist, has a different definition of what “interesting” means in that context. So CODAR has developed a tool called Z-checker that lets users evaluate the quality of a compression method.

“It allows people to check the quality of a compression method from potentially dozens of different perspectives,” Foster said. “And we’re starting to see that technique being deployed pretty widely across different applications.”
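The sketch below conveys the flavor of such multi-perspective checks in plain NumPy; it is not Z-checker’s actual API, and the metric set is illustrative rather than exhaustive:

```python
import numpy as np

def quality_report(original, decoded):
    """Judge a lossy reconstruction from several perspectives at once,
    in the spirit of Z-checker (this is not its actual API)."""
    err = decoded - original
    value_range = original.max() - original.min()
    rmse = float(np.sqrt(np.mean(err ** 2)))
    return {
        "max_abs_error": float(np.abs(err).max()),    # worst-case pointwise error
        "rmse": rmse,                                  # average distortion
        "psnr_db": 20 * np.log10(value_range / rmse),  # range-based signal-to-noise
        "pearson_r": float(np.corrcoef(original.ravel(),
                                       decoded.ravel())[0, 1]),  # structure preserved?
    }

# Compare a field against a lossy reconstruction of it.
field = np.sin(np.linspace(0.0, 10.0, 100_000))
lossy = field + np.random.uniform(-1e-2, 1e-2, field.shape)
for name, value in quality_report(field, lossy).items():
    print(f"{name}: {value:.6g}")
```

Which of these perspectives captures “interesting” is exactly the per-application judgment the tool leaves to the scientist.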

Foster envisions high-performance computing simulations moving from the traditional mode of a single simulation running across an entire computer and outputting its data for later use to a far more dynamic world in which many components may be running simultaneously.

“Reduction and analysis are being performed at the same time as the simulation,” he said. “In some cases, many different simulations are being performed simultaneously, and we want to be the people who provide the infrastructure that will make that possible. So, we see ourselves as ushering in a new approach to scientific computing that we think will become the norm on exascale computers.”

Download the MP3
