Last week HPCwire ran an article by Michael Wolfe of The Portland Group called "Compilers and More: The Dangers of COTS Supercomputing." In it the author argues that the current HPC market is mired by its reliance on commodity technology and by a lack of investment in both hardware and software.
The HPC ecosystem is in perfect balance, with little investment and innovation in both hardware and software. We’re in a precarious position now. …In response to Rob Pennington, I believe that the HPC market is too small to support an aggressive hardware business, and it’s equally true that it’s too small to support a software tools industry.
Michael doesn’t offer much hope for a resolution, however:
…the tools will appear only if and when they apply to a larger market, or if some company (unlikely) or government agency (perhaps likely) chooses to make a long-term strategic investment.
As a person with libertarian leanings, I rarely think big government investments are healthy in the long term.
To round out the conversation, Joe Landman over at scalability.org has a different view. Joe argues that we have the market we have because that’s the market we were willing to pay for (he’s right). He also argues that this market will continue to respond to the stimuli we give it, and that there is reason to think we will continue to adapt our techniques to new technologies as they become available (he discusses the specific example of GPGPUs).
Joe’s argument has the edge of practicality — not pathetically waiting for the scraps to fall from the master’s table, but recognizing that the scraps are going to fall, and using that knowledge as part of a larger strategy.
Read both, and decide for yourself. Then leave a comment about what you see ahead for HPC: is it a dead market walking, or just waiting to transform yet again?