Why The HPC Growth Equation Hasn’t Added Up

In this special feature report, Thomas Thurston looks at the reasons why the HPC market hasn’t grown as fast as expected. Is it time for high performance computing to get more disruptive?

Why doesn’t the HPC industry grow more? While HPC has certainly had its booms and busts, hardly a conference, panel or roundtable goes by without the issue of frustrated industry growth rolling down the aisle. Some estimates put the industry’s growth at a paltry 5% annual average between 1999 and 2009.[i] While there’s no shortage of speculation as to the cause, perhaps it’s auspicious timing that Harvard Professor Clayton Christensen has been slated to deliver the keynote address at SC10 in New Orleans next month. He is, after all, the Yoda of disruptive innovation.

It turns out, not all innovation is created equal. In Christensen parlance, innovation is either “sustaining” or “disruptive,” with different growth consequences to be expected in each case. The challenge for HPC is that, by its very nature, the industry disproportionately favors sustaining innovation over disruptive innovation. This, in turn, systemically holds back the full growth potential of HPC.

It’s hard to be a customer = Low industry growth

For an industry to grow its overall revenue (i.e., size) there must be more deals, larger deals, or both: Revenue = Price x Quantity. Another way to frame this is in terms of “cost” and “access.” In the case of HPC, “cost” refers to the towering financial threshold that customers must cross before they can even consider an HPC system. “Access” refers to the elite knowledge, skill and sophistication that customers must also possess in order to run, maintain and benefit from HPC solutions. Unless the HPC industry profoundly lowers cost or increases access (or both), there simply won’t be enough customers who can utilize HPC – and industry growth will suffer.
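To make the arithmetic concrete, here is a back-of-the-envelope sketch in Python. The deal sizes and counts below are hypothetical, chosen purely to illustrate the trade-off, not drawn from market data:

```python
# Hypothetical illustration of Revenue = Price x Quantity.
# Lowering the cost threshold ("cost") and the skills threshold
# ("access") expands the pool of customers who can buy at all.

def industry_revenue(avg_deal_price, num_deals):
    """Industry revenue as average deal price times number of deals."""
    return avg_deal_price * num_deals

# Sustaining path: ever-pricier systems, few buyers who can clear
# the cost and skills thresholds (all figures are made up).
print(industry_revenue(avg_deal_price=50e6, num_deals=20))     # 1.0e9

# Disruptive path: cheaper, more accessible systems, many more buyers.
print(industry_revenue(avg_deal_price=0.5e6, num_deals=5000))  # 2.5e9
```

In this toy scenario, a hundred-fold price cut more than pays for itself once it unlocks enough new customers – which is precisely the growth lever the rest of this article argues HPC under-uses.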

Sustaining Innovation = Higher Cost x Lower Access

With that in mind, there is also a dangerous myth that innovation inevitably lowers cost, and that costs reliably decline in a competitive market. As captured by Christensen in The Innovator’s Prescription, “sustaining” innovations – the kind favored by HPC – tend to preserve or increase costs. Meanwhile “disruptive” innovations tend to bring costs down.[iii]

“Sustaining” innovations are defined by offering better performance in mainstream markets. For example, in 1976 radiography systems (x-ray machines) cost upwards of US$200K and were sold by fierce global competitors such as General Electric, Siemens, Philips, Hitachi and Toshiba. In the struggle for market share, each competitor routinely came out with ingenious sustaining innovations year after year. Higher image resolution, 3D imaging and finer scanning sensitivity were delivered as x-ray technology evolved into what is now computed tomography (the CT scan).

Through such sustaining innovations General Electric, Siemens, Philips and the others could justify prices of more than $400K for scanners in the mid-1980s, and upwards of $1 million for new scanners today. The technology got better, which is why prices remained high.

Like radiography, HPC has been prone to sustaining innovation. This is because, almost by definition, HPC exists to push the bleeding edge of ever-higher performance. For example, in 1984 it cost around $15 million for a Cray X-MP/48 with a theoretical peak performance of over 800 MFLOPS. Nearly a quarter-century of breathtaking HPC innovation later, IBM’s Roadrunner broke the petaFLOPS barrier with systems costing upwards of $50-$60 million. While price-per-performance improved by many orders of magnitude, the sticker price increased three- to four-fold and systems grew increasingly complex – daunting hurdles for many would-be customers.
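The arithmetic behind that comparison is worth spelling out. A minimal sketch, using the approximate prices and peak ratings quoted above (rounded figures, not exact list prices):

```python
# Rough price-per-performance arithmetic for the two systems above.
# All figures are the approximate ones quoted in the text.

cray_price, cray_peak = 15e6, 800e6   # Cray X-MP/48 (1984), ~800 MFLOPS
rr_price, rr_peak = 55e6, 1e15        # IBM Roadrunner, ~1 PFLOPS peak

print(cray_price / cray_peak)   # ~1.9e-2 dollars per FLOPS
print(rr_price / rr_peak)       # ~5.5e-8 dollars per FLOPS
print(rr_price / cray_price)    # ~3.7x higher entry price
```

Cost per FLOPS fell by more than five orders of magnitude, yet the cheapest ticket into the top tier still more than tripled – the sustaining pattern in a nutshell.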

Disruptive Innovation = Lower Cost x Higher Access

Unlike “sustaining” innovations, “disruptive” innovations tend to bring costs down. Rather than delivering higher performance to mainstream markets, disruptive innovations often target lower-cost solutions that are “good enough” for a sub-segment of the market, but lower performing than the alternatives. They can also be solutions designed specifically to allow the less skilled or less wealthy to do what was previously done only by those with greater skill or resources.

For example, Complete Genomics is a service that sequences human genomes for medical research. The first human genome took around 13 years and $3 billion to sequence, finishing in 2003, whereas today Complete Genomics offers highly automated, HPC-driven sequencing for only $5K per genome. While Complete Genomics offers a relatively limited set of analyses compared with more sophisticated wet labs, it is allowing a large population of less sophisticated medical researchers to begin analyzing genomic data in a manner that was previously beyond their means and expertise. HPC is making genomics more accessible and lowering costs.

Similarly, Linux and other open source tools have long had a disruptive impact on HPC. While historically not as high-performing as incumbent solutions (such as proprietary Unix-based systems), the openness, accessibility and dramatically lower cost of Linux allowed it to enter from below and gradually improve in performance over time. In doing so it has not only taken share from existing competitors, but has also expanded the overall market for high-end systems. Lower cost, more access, more industry growth.

Past ≠ Future

This is not to say that sustaining innovation isn’t important. For the record, it is important. It is also not to say that HPC is never disruptive – the Complete Genomics and Linux examples show HPC can be disruptive (both in terms of end-user applications and the core HPC technology itself). Rather, the message here is that different types of innovation tend to produce different effects on firm and industry growth. One has a tendency to be exclusive, whereas the other tends to be inclusive.

Neither sustaining nor disruptive innovation is easy. Yet the HPC industry is nothing if not outstanding at the sustaining type; after all, HPC performance increases, on average, by two orders of magnitude every decade. Sustaining innovation is not the primary holdup. Rather, perhaps it is time to ask how HPC can become more disruptive.
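To put a number on that mismatch: two orders of magnitude per decade compounds to roughly 58% annual performance growth, set against the roughly 5% annual revenue growth cited earlier. A quick check:

```python
# Performance compounds ~100x per decade; revenue grew ~5% per year
# over 1999-2009 (the IDC estimate cited earlier).

perf_cagr = 100 ** (1 / 10) - 1   # two orders of magnitude per decade
revenue_cagr = 0.05

print(f"{perf_cagr:.1%} annual performance growth")  # 58.5%
print(f"{revenue_cagr:.1%} annual revenue growth")   # 5.0%
```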

How can HPC tools more deliberately enable lower-cost, higher-access solutions?

How can HPC allow large populations of less skilled customers with fewer resources to begin doing what was previously only done by those with greater skill and wealth?

These are the new design challenges that HPC firms, the Titans of sustaining innovation, must wrap their heads around if they are serious about growth. It’s time to re-balance the equation.

About the author: Thomas Thurston is a renowned thought leader in corporate strategy and investment methodology. He is currently President and Managing Director of Growth Science International.


[i] Wu & Conway et al., IDC HPC Market Update, IDC Technical Computing Group (2009); IDC, Assessing the Commercial HPC Market, Department of Defense report (1999).
[ii] Wu & Conway et al., IDC HPC Market Update, IDC Technical Computing Group (2009); IDC, Assessing the Commercial HPC Market, Department of Defense report (1999).
[iii] Christensen, Grossman & Hwang, The Innovator’s Prescription: A Disruptive Solution for Health Care, McGraw-Hill (2009).
[iv] Wu & Conway et al., IDC HPC Market Update, IDC Technical Computing Group (2009).

Comments

  1. Except for the graphics, this article is badly written and does not explain its point clearly.

  2. Blue is a jerk. I thought this was very thought provoking.

  3. What is the definition of HPC? It seems like you can’t really talk about “growth in HPC” without it.

    If HPC is the top 5% of computing capability (or similar time-relative metric), then it would seem the question of “why isn’t HPC growing?” is answered before we start. Given the reference to 1984/Cray XMP vs. 2010/IBM FLOPS ratings, it seems like this is the kind of metric being used.

    If you look at HPC as some fixed capability (some kind of metric that doesn’t change each year) — e.g. the ability to simulate a car crash — then costs have dropped, many areas of “access” have improved and HPC has grown dramatically. (Though I’d agree that “access” in the sense of skills/knowledge still has a way to go)

  4. How about factoring in inflation between 1975 and today?

  5. I think this article presents a lot to think about. More and more companies that I run into in HPC are starting to pay attention to disruptive vs. sustaining business strategy. This is a nice break from the typical tech articles.

  6. Agreed! Good article, want to see more.

  7. Excellent article, Thomas. There are a few nits that could be picked with respect to the methodology (for example, 2009 was an abnormally down year due to lengthened sales cycles, and we expect it to rebound in 2011), but your overall conclusion that we are missing opportunities in HPC is sound.

    In a study we recently completed with the National Center for Manufacturing Sciences, we found a huge disconnect between large and small companies’ use of HPC among U.S. manufacturers — one of the most mature HPC-using industries. The vast majority of U.S. manufacturers have fewer than 100 employees; fewer than 10% of those use any HPC. (Executive summary at: http://www.intersect360.com/industry/research.php?id=35.) This statistic is representative of the missed opportunity in bringing modeling and simulation capabilities to the volume market. What we’ve found is that the popular term “missing middle” is a misnomer. For the commercial sector especially, it’s really everything but the top that is missing.