In this special feature report, Thomas Thurston looks at the reasons why the HPC market hasn’t grown as fast as expected. Is it time for high performance computing to get more disruptive?
Why doesn’t the HPC industry grow more? While HPC has certainly had its booms and busts, hardly a conference, panel or roundtable goes by without the issue of frustrated industry growth rolling down the aisle. Some estimates put the industry’s growth at a paltry 5% annual average between 1999 and 2009. While there’s no shortage of speculation as to the cause, perhaps it’s auspicious timing that Harvard Professor Clayton Christensen has been slated to deliver the keynote address at SC10 in New Orleans next month. He is, after all, the Yoda of disruptive innovation.
It turns out, not all innovation is created equal. In Christensen parlance, innovation is either “sustaining” or “disruptive,” with differing growth consequences to be expected in each case. The challenge for HPC is that, by its very nature, the industry tends to disproportionately favor sustaining innovation rather than disruptive. This, in turn, systemically holds back the full growth potential of HPC.
It’s hard to be a customer = Low industry growth
For an industry to grow its overall revenue (i.e. size) there must be more deals, larger deals, or both: revenue is price times quantity. Another way to frame this is in terms of “cost” and “access.” In the case of HPC, “cost” relates to the towering financial thresholds that customers must cross before they can even consider an HPC system. “Access” relates to the elite knowledge, skill and sophistication that customers must also possess in order to run, maintain and benefit from HPC solutions. Unless the HPC industry profoundly lowers costs or increases access (or both), there simply won’t be enough customers who can utilize HPC – and industry growth will suffer.
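The price-times-quantity framing can be made concrete with a toy calculation. All figures below are hypothetical, chosen only to illustrate the mechanism: raising prices while the customer pool stays small can yield less total revenue than cutting prices enough to reach a much larger pool of buyers.

```python
# Illustrative sketch of the "revenue = price x quantity" framing.
# The prices and customer counts are invented for illustration, not market data.

def industry_revenue(price, customers):
    """Total industry revenue: average deal price times number of deals."""
    return price * customers

# Sustaining path: price rises, but the pool of customers who can afford
# (and operate) the systems stays small.
sustaining = industry_revenue(price=50_000_000, customers=20)

# Disruptive path: a "good enough" system at a fraction of the price
# reaches a far larger population of less wealthy, less skilled buyers.
disruptive = industry_revenue(price=500_000, customers=5_000)

print(sustaining)  # 1000000000  ($1B)
print(disruptive)  # 2500000000  ($2.5B)
```

Under these assumed numbers the lower-priced, higher-access path produces more total industry revenue despite a hundred-fold smaller deal size – which is the article’s core argument in miniature.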
Sustaining Innovation = Higher Cost x Lower Access
With that in mind, there is also a dangerous myth that innovation inevitably lowers cost, and that costs reliably decline in a competitive market. As captured by Christensen in The Innovator’s Prescription, “sustaining” innovations – favored by HPC – tend to preserve or increase costs. Meanwhile “disruptive” innovations tend to bring costs down.
“Sustaining” innovations are defined by offering better performance in mainstream markets. For example, in 1976 radiographs (x-rays) cost upwards of US$200K and were sold by fierce global competitors such as General Electric, Siemens, Philips, Hitachi and Toshiba. In the struggle for market share, each competitor routinely came out with ingenious sustaining innovations year after year. Higher image resolution, 3D imaging and finer scanning sensitivity were delivered as x-ray technology evolved into what is now computed tomography (CT, or “CAT,” scans).
Through such sustaining innovations General Electric, Siemens, Philips, and others could justify scanner prices of more than $400K in the mid-1980s, and upwards of $1 million for new scanners today. The technology got better, which is why prices remained high.
Like radiography, HPC has been prone to sustaining innovation. This is because, almost by definition, HPC exists to push the bleeding edge of ever-higher performance. For example, in 1984 it cost around $15 million for a Cray X-MP/48 with a theoretical peak performance of over 800 MFLOPS. Twenty-four years of breathtaking HPC innovation later, IBM’s Roadrunner broke the petaFLOPS barrier in 2008 with systems costing upwards of $50-$60 million. While price-per-performance improved dramatically, the sticker price increased three- to four-fold and systems grew increasingly complex; daunting hurdles for many would-be customers.
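The two trends in that comparison – collapsing price-per-performance alongside a rising sticker price – can be checked with the article’s own round numbers. The figures below are the article’s approximations, not precise procurement data.

```python
# Back-of-envelope check of the Cray X-MP/48 vs. IBM Roadrunner comparison,
# using the round numbers cited in the text.

def dollars_per_gflops(price_usd, peak_gflops):
    """Cost of one GFLOPS of theoretical peak performance."""
    return price_usd / peak_gflops

# Cray X-MP/48 (1984): ~$15M for ~0.8 GFLOPS (800 MFLOPS) peak.
xmp = dollars_per_gflops(15_000_000, 0.8)

# IBM Roadrunner (2008): ~$60M (upper estimate) for ~1 PFLOPS = 1,000,000 GFLOPS.
roadrunner = dollars_per_gflops(60_000_000, 1_000_000)

print(round(xmp))                          # 18750000 -> ~$18.75M per GFLOPS
print(round(roadrunner))                   # 60       -> ~$60 per GFLOPS
print(round(60_000_000 / 15_000_000, 1))   # 4.0      -> sticker price grew ~4x
```

Price-per-performance improved by roughly five orders of magnitude, yet the entry ticket still grew several-fold – exactly the pattern of sustaining innovation the article describes.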
Disruptive Innovation = Lower Cost x Higher Access
Unlike “sustaining” innovations, “disruptive” innovations tend to bring costs down. Rather than delivering higher performance to mainstream markets, disruptive innovations often target lower cost solutions that are “good enough” for a sub-segment of the market, but lower performing than the alternatives. They can also be solutions designed to specifically allow the less skilled or less wealthy to do what was previously done only by those with greater skill or resources.
For example, Complete Genomics is a service that sequences human genomes for medical research. The first-ever human genome took around 13 years and $3 billion to sequence in 2003, whereas today Complete Genomics offers highly automated, HPC-driven sequencing for only $5K per genome. While Complete Genomics offers a relatively limited set of analyses compared with more sophisticated wet labs, it is allowing a large population of less sophisticated medical researchers to begin analyzing genomic data in a manner that was previously beyond their means and expertise. HPC is making genomics more accessible and lowering costs.
Similarly, Linux and other open source tools have long had a disruptive impact on HPC. While not historically as high performance as incumbent solutions (such as proprietary Unix-based systems), the openness, accessibility and dramatically lower cost of Linux allowed it to enter from below and gradually increase performance over time. By doing so it has not only taken share from existing competitors, but it has also expanded the overall market for high-end systems. Lower cost, more access, more industry growth.
Past ≠ Future
This is not to say that sustaining innovation isn’t important. For the record, it is important. It is also not to say that HPC is never disruptive – the Complete Genomics and Linux examples show HPC can be disruptive (both in terms of end-user applications and the core HPC technology itself). Rather, the message here is that different types of innovation tend to produce different effects on firm and industry growth. One has a tendency to be exclusive, whereas the other tends to be inclusive.
Neither sustaining nor disruptive innovation is easy. Yet the HPC industry is nothing if not outstanding at the sustaining type; after all, on average HPC performance increases by two orders of magnitude every decade. This is not the primary holdup. Rather, perhaps it is time to ask how HPC can become more disruptive.
How can HPC tools more deliberately enable lower cost and greater access solutions?
How can HPC allow large populations of less skilled customers with fewer resources to begin doing what was previously only done by those with greater skill and wealth?
These are the new design challenges that HPC firms, the Titans of sustaining innovation, must wrap their heads around if they are serious about growth. It’s time to re-balance the equation.
About the author: Thomas Thurston is a recognized thought leader in corporate strategy and investment methodology. He is currently President and Managing Director of Growth Science International.