Is the Party Over for Exascale Ambitions?

Peter Kogge writes in IEEE Spectrum that we won't be seeing an exascale supercomputer anytime soon.

We consulted with scores of other engineers on particular new technologies, we made dozens of presentations to our DARPA sponsors, and in the end we hammered out a 278-page report [PDF], which had lots of surprises, even for us. The bottom line, though, was rather glum. The practical exaflops-class supercomputer DARPA was hoping for just wasn’t going to be attainable by 2015. In fact, it might not be possible anytime in the foreseeable future. Think of it this way: The party isn’t exactly over, but the police have arrived, and the music has been turned way down.

Kogge goes on to say that even if the power and heat problems can be solved, there would still be the monumental challenge of getting 160 million cores to work together efficiently.
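
For a sense of why that scale is so daunting, here is a rough back-of-the-envelope sketch using Amdahl's law. This is my own illustration, not a model from Kogge's report: the serial fractions are assumed values, the 160-million-core count is simply the figure quoted above, and real obstacles such as memory bandwidth, interconnect latency, and fault tolerance are not captured by this simple bound.

```python
# Amdahl's law: speedup on N cores is capped at 1 / (s + (1 - s) / N),
# where s is the fraction of the work that cannot be parallelized.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Upper bound on speedup for a given serial fraction and core count."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

CORES = 160_000_000  # the ~160 million cores mentioned above (assumed figure)

# Even vanishingly small serial fractions cap the achievable speedup far
# below the core count, leaving most of the machine idle.
for s in (1e-6, 1e-7, 1e-8):
    speedup = amdahl_speedup(s, CORES)
    efficiency = speedup / CORES
    print(f"serial fraction {s:.0e}: speedup ~{speedup:,.0f} "
          f"({efficiency:.1%} of peak)")
```

With even one part in ten million of the work left serial, this toy model already lands in the single-digit-percent utilization range, which is roughly the figure quoted in the comments below.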

Update: A 2008 IEEE podcast interview with Peter Kogge discussing Exascale computing is also available. Download the MP3.

Comments

  1. Was there an application DARPA had in mind beyond benchmarks? Have we already exhausted petascale computing?

  2. “It would be tough to keep even a small fraction of those processors busy at the same time. Realistic applications running on today’s supercomputers typically use only 5 to 10 percent of the machine’s peak processing power at any given moment.”

    That seems pretty wasteful to me. Can't they run Folding@Home, World Community Grid, or something else when the supercomputer isn't doing anything useful?