In this special guest feature, Doug Black from The Exascale Report writes that, while the idea of Grand Challenges is not new, the need for powerful computational tools to solve these global issues remains unchanged.
Flash back to 1992. Do you remember the ‘Blue Book’ and the HPCC program? If this is your first exposure to the ‘Grand Challenges’, you may find this quite interesting. On November 7, 2012, senior representatives of the DOE labs sent a letter to Secretary of Energy Steven Chu to report on a Grand Challenges Workshop on Advanced Computing for Energy Innovation held in late July – early August 2012.
While the workshop recommendations focused on what it called Technical, Structural, and Incentive ‘Grand Challenges’, one of its final recommendations was to establish an Advanced Computing for Energy (ACE) program within the Department of Energy. When I read this letter, I had an intense sense of déjà vu – one of those ‘here we go again’ feelings. But in a good way.
For a moment, it felt like 1992 all over again, a year of unusually high energy and high promise in the HPC community. It’s the year we really sank our teeth into the teraFLOPS challenge. It seemed the entire community rallied in support of what the first President Bush’s science advisor, D. Allan Bromley, labeled the Grand Challenges – referring to high performance computing and communications. Those Grand Challenges were the challenges of science.
It was the beginning of a period of powerful government and private industry collaboration referred to as the HPCC program. I pulled this quote from the program’s overview documentation:

The HPCC Program is driven by the recognition that unprecedented computational power and capability is needed to investigate and understand a wide range of scientific and engineering “grand challenge” problems.
The program’s famous “Blue Book” also made this point:
The HPCC Program is the result of several years of effort on the part of senior government, industry, and academic scientists and managers to design a research agenda to extend U.S. leadership in high performance computing and networking technologies.
So, in many ways, nothing has really changed. Again, I mean this in a good way. The 2012 appeal to address the world’s ‘Grand Challenges’ is eerily similar to what we addressed 20 years ago. HPC is an ever-widening circle that keeps coming around. Twenty years ago, the Grand Challenges included climate prediction and genome mapping. Today, the great need is energy innovation and saving the environment. Tomorrow, it may be food. This is HPC and this is how HPC works: tackling, as ever, the need for funding and the urgency of applying extreme computational resources to the greatest scientific challenges of our time.