From Grand Challenges to Critical Workflows

Geert Wenes, Sr. Practice Leader at Cray

A major objective of President Obama’s recent Executive Order creating a National Strategic Computing Initiative is to increase the “coherence between the technology base used for modeling and simulation and that used for data analytic computing.” Along these lines, Geert Wenes writes on the Cray Blog that the next generation of Grand Challenges will focus on critical workflows for exascale.

For every historical HPC grand challenge application, there is now a critical dependency on a series of other processing and analysis steps, data movement, and communications that go well beyond the pre- and post-processing of yore. This work is iterative, sometimes synchronous (in situ), and generally on a more equal footing with the “main” application. (In fact, there may no longer even be a main application.) It involves a generous amount of quality assurance as well as validation. Input data can be massive and is typically sensor data. In such workflows the data is always noisy, the models are always incomplete, and the task is never truly done: data gets reprocessed and analyzed de novo, sometimes years later. Hence, a thorough understanding of the acquisition and processing history is essential.
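To make that last point concrete, here is a minimal, hypothetical Python sketch of one way a workflow step could carry its own acquisition and processing history: each transformation logs what was run, with which parameters, on which input, and when, so the data can be reprocessed or re-audited years later. The `Record` class and the step functions are illustrative assumptions, not anything described in the Cray post.

```python
import hashlib
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Record:
    """A data product plus the history of every step that produced it."""
    data: bytes
    provenance: list = field(default_factory=list)

    def apply(self, step_name: str, func, **params):
        """Run one pipeline step and log what was done, to what, and when."""
        self.provenance.append({
            "step": step_name,
            "params": params,
            "input_sha256": hashlib.sha256(self.data).hexdigest(),
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
        self.data = func(self.data, **params)
        return self


# Illustrative steps: denoise raw sensor bytes, then run a QA check.
def denoise(data: bytes, threshold: int) -> bytes:
    return bytes(b for b in data if b >= threshold)


def validate(data: bytes, min_length: int) -> bytes:
    assert len(data) >= min_length, "QA check failed: too little data survived"
    return data


record = Record(data=bytes([3, 120, 45, 200, 7, 99]))
record.apply("denoise", denoise, threshold=10).apply("validate", validate, min_length=3)

# Years later, the full processing history is still attached to the data.
print(json.dumps(record.provenance, indent=2))
```

Hashing the input of each step gives a cheap reproducibility check: if the same step with the same parameters is re-run on data with the same digest, any divergence in results points to a change in the code rather than the data.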

Read the Full Story