How can startups succeed in HPC? Disruptive modeling says to avoid high-performance products


What if there were a way to model the probability of your company’s success – a methodology that could predict success or failure with amazing accuracy?

Within just the next few years, we’re likely to have a new standard that will be used by many investors to help them with due diligence before committing their money to new technology startups.

In this exclusive insideHPC interview, Mike Bernhardt talks with Thomas Thurston, president and managing director of Growth Science International, a prediction research firm that specializes in disruptive modeling, using high-accuracy tools to predict the likely success or failure of businesses.

insideHPC: What is “Disruptive Modeling,” and why should HPC companies care about it?

Thurston: Disruptive modeling is a surprising way to predict whether a business is likely to survive or fail. It grew out of research pioneered over a decade ago by Professor Clayton Christensen at Harvard. Building on that research – first at Intel a few years ago, and then at Harvard with Christensen – I made some new breakthroughs that have not yet been published. VCs, investment firms, large companies and others now work with my firm to guide their investments. Through these breakthroughs we’ve predicted whether businesses will live or die with more than 80% accuracy. It’s uncanny. It also turns out that HPC is one of the most enigmatic sectors when viewed through the models.

insideHPC: 80% accuracy is hard to believe. Venture capitalists tend to be right 10-20% of the time when they bet on a business. Could your results be luck?

Thurston: That’s the right question to ask, and it’s the first thing professors at MIT brought up at the beginning of my research back in 2006–2007. Since, on average, about 80–90% of all businesses fail, we would have been 80–90% “accurate” if we’d simply predicted that every business will fail. Long story short, we applied the right statistical rigor to make sure our results weren’t dumb luck. They were statistically significant with more than 99% confidence. Still, we keep challenging the models through some pretty rigorous scrutiny, and so far, so good. No, this is not just luck.
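To make the base-rate reasoning concrete, here is a minimal sketch in Python with purely hypothetical numbers (the interview does not disclose Growth Science’s actual data or methods). It asks how often a no-skill predictor, pegged at an assumed ~85% base failure rate, would match a model’s hit count by chance – the kind of check needed before claiming 99% confidence:

```python
# Minimal sketch of a base-rate significance check (hypothetical numbers).
# With ~85% of businesses failing, "predict everyone fails" is already
# ~85% accurate, so a model's hit rate only matters if it beats that
# baseline by a statistically significant margin.
from math import comb

def binomial_tail(n: int, k: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the chance that a predictor
    which is right with probability p scores k or more hits out of n."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_businesses = 100   # hypothetical sample of businesses tracked
n_correct = 95       # hypothetical number the model called correctly
base_rate = 0.85     # assumed base rate: ~85% of businesses fail

p_value = binomial_tail(n_businesses, n_correct, base_rate)
print(f"P(base-rate predictor scores >= {n_correct}/{n_businesses}) = {p_value:.4f}")
# A p-value below 0.01 would be consistent with a ">99% confidence"
# claim: the model's accuracy is unlikely to be the base rate in disguise.
```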

insideHPC: Can you illustrate how these models work, using examples from HPC?

Thurston: Sure. Every year it seems like a new crop of HPC startups promise higher performance than what’s already on the market. Almost by definition, HPC startups typically shoot for high(er) performance. Yet surprisingly, this combination of being a “new entrant” with a “better performance” solution creates an extremely low empirical probability of success for many companies.

Here’s an example that hits home with most folks in the HPC community. Think of SiCortex. It seems like just yesterday SiCortex was making a splash with its hardware and virtualization technology, not to mention the visual appeal of the boxes. When I first looked at it back in 2007, I have to say I was impressed. Yet our models predicted that SiCortex would fail regardless, and that’s exactly what happened. From another perspective, think of all the new entrants who tried to beat industry storage leaders like EMC at the performance game. Most of those new entrants perished, Entrada Networks being a good example from the 1990s. The models suggest that instead of shooting for higher performance, companies like SiCortex and Entrada Networks would have been far more likely to succeed if they’d chosen any one of three specific go-to-market strategies instead. It’s a shame the models weren’t more available then.

insideHPC: If HPC startups shouldn’t shoot for high performance, what should they do?

Thurston: That’s where the methodology and our analysis really add value. It’s going to be somewhat different for each company, but it turns out that HPC startups can pursue strategies other than just “more performance” with surprisingly higher empirical probabilities of success. This is what most people don’t realize. For example, while it’s not the only strategy that works, in prescribed circumstances the highest-probability approach is for a new company to start with the lowest-performing, lowest-cost solution in the market. In other words, even if HPC is your end goal, the fastest path can be to start at the low end, not the high end. This defies most thinking.

Back to storage: rather than offering better performance, NetApp entered the market with low-cost, low-performing NAS boxes and has done tremendously well. They came in at the low end and kept improving every year, taking market share from SANs one small bite at a time for over a decade. If, rather than starting at the low end, NetApp had taken on EMC at the high end on day one, EMC probably would have crushed them just like Entrada Networks. Instead, by some estimates NetApp’s revenue last year was roughly 20% the size of the entire HPC industry. That’s the kind of difference the right models can make.

insideHPC: What if a new technology is radically better? At some point do the odds change?

Thurston: Don’t get me wrong – in certain circumstances, higher-performing technology can be quite likely to succeed. Industry incumbents successfully launch higher-performing technology all the time. Disruption models are circumstance-based, so in some circumstances – where you are a significant incumbent – better technology is often the right move.

However, things change if the company is a startup or a new entrant to an industry. As a new entrant, even if you’re 100x faster and 50% lower cost, you’re statistically doomed. Some people have a hard time accepting empirically based strategy because it can run against intuition. Intuition is good, but it’s risky to ignore data. It pays to know what has or hasn’t worked across large data sets before jumping into a market, especially HPC.

insideHPC: What about GPUs? Did the models predict the growing fervor around GPUs?

Thurston: At Intel years ago, we had some conversations with the group that founded Larrabee. The models predicted that GPUs would start disrupting CPUs, and it was a pleasant surprise to learn the Larrabee team had already considered this. While it’s hard to say exactly what prompted Larrabee, I think this notion of disruption most likely had a lot to do with it, whether or not that was the exact vocabulary they were using at the time. History will show whether Larrabee is a success, but there’s no question that GPUs are moving up-market in a disruptive direction toward the heart of Intel’s enterprise platforms, not to mention low-end platforms too.

insideHPC: What do you mean? Could GPUs disrupt CPUs in the consumer segment as well?

Thurston: The threat is definitely there. ARM architectures have been on a steady path to disrupt x86 CPUs for over a decade. Starting at the low end with cell phones, ARM-based CPUs have somewhat quietly moved up-market into smartphones, and now netbooks. As they continue to improve in line with Moore’s Law, they are now reaching 2 GHz while remaining far lower in power and cost than even Atom. The tipping point is likely to begin in earnest during 2010.

Nvidia’s Ion platform foreshadows where GPUs may come into play at the low end. Last year Nvidia created a reference design that paired the GeForce 9400 with an Atom CPU. It was a two-chip system, since the northbridge and southbridge were combined into the GPU. In that design you don’t need a very sophisticated CPU to get some pretty compelling performance, since most of the processing is done by the GPU. If Atom plus a GPU can deliver great performance, how long do you think it will be before a much lower-cost ARM core is paired with an Nvidia GPU to do 95% of what the consumer market needs? That’s one path to a $100 notebook with days of battery life instead of hours. It will be interesting to see how Intel responds.

You have to remember that Intel may not be the presumptive ‘Big Dog’ in the battle with ARM. Roughly 20x more ARM chips shipped last year than IA (Intel Architecture) chips. Intel’s R&D also has to compete with hundreds of ARM licensees, most of which are fabless, which gives them cost advantages and flexibility.

insideHPC: Are you saying Intel is facing possible disruption at the low end, and the high end, at the same time?

Thurston: That’s exactly right. But that doesn’t mean Intel isn’t well aware of this and already developing a strategy to deal with the situation. Intel has survived disruption before – like when AMD and Cyrix were attacking from below in the 1990s. By recognizing what was happening, with Christensen’s help I should add, Intel responded with the low-cost, low-performance Celeron. AMD was slowed down and Cyrix exited the market. It probably saved Intel for a generation. However, Intel can’t simply copy what they did with the Celeron in the face of today’s GPUs and ARM. I for one am certainly not counting Intel out. They are far more innovative than people often give them credit for. I’m just saying there will have to be a new playbook. It’s going to be a wild ride.

insideHPC: Can a company build an effective product strategy with a GPU-only system?

Thurston: Yes, but there’s a significant challenge. Most of the new-entrant, GPU-based vendors I see popping up are aiming for higher performance than other GPU- or CPU-based systems. As we discussed earlier, this is especially the case in HPC, where, by definition, people go for the high end. The research suggests most of these new entrants are doomed unless they have a methodology to help them think through the right strategic questions. The implications are tremendous for the software community as well.

insideHPC: What advice do you have for anyone entering HPC?

Thurston: Obviously my research gives me a bias, but to steal a line from Michael Mauboussin, “the plural of anecdote is not evidence.” Experience and intuition are the main tools most of us use to develop HPC strategy. Yet, while experience and intuition are invaluable, they are not the same as data. It turns out that, at least empirically, the best decisions are often counterintuitive and defy conventional experience. Whether using disruptive models or others, HPC companies need to take market precedent to heart. It’s not enough just to say “the odds of success are poor… but not for me.” Disruptive modeling is not new, but we have made tremendous strides in this area in the past few years and have taken it where it had never gone before. Many companies – and investors – would be well advised to take advantage of this predictive analysis tool.

Thomas Thurston is President and Managing Director of Growth Science International, based in Beaverton, Oregon. Readers can contact the company at info@growthsci.com.


Comments

  1. I saw Thomas present some of this research at an innovation summit in Atlanta about a year and a half ago. Really tremendous stuff! It was probably the most useful presentation I’ve ever seen, and that’s saying a LOT. He’s quantified some things that I never believed could be quantified, and so far everything he predicted looks like it’s coming true. Glad to see it’s finding its way into HPC.

  2. As an owner of a small business involved in HPC, I can certainly understand Thurston’s perspective, and there is great insight in his research. The issues involved with HPC are more complex than merely increased performance. The critical issue in survival ultimately is value – both the reality and the perception – in a company’s product portfolio. Specifically, how do the products contribute to the important and necessary work our clients need to accomplish?

    Disruptive technologies are a pain for our clients – unless the adoption of the technology is SO important that it is obviously worth the trouble. The reality of that dynamic is something I have to test every day.

  3. Safwan Zaheer says

    Great insights from Thomas. I met him at HBS while I was a student at MIT, and I’ve known him for the past ~2 years. I would state with no hesitation that he is an authority on disruption. I’ve also had the pleasure of working with him on a few projects, and I often consult him on disruption.

    Disruption is happening everywhere; the challenge is to detect it and take the appropriate strategic action to confront it (if it is working against you). The good news is that there are models that help predict outcomes of success and failure. Unfortunately, the investor community has yet to embrace a data-driven, empirically based approach to making investment decisions. Thomas and I jointly looked into a start-up that had the potential to “disrupt” the traditional chip manufacturers, if only the VCs hadn’t been pressuring the start-up to do the opposite. The problem also is that investors are “impatient for growth but patient for profits,” yet disruption theory suggests the exact opposite (counter to intuition)!

    Thomas – keep up the good work and many thanks to Mike for highlighting this important tool. Hopefully, the investor community will start to embrace such models sooner rather than later.

  4. I generally agree with Safwan Zaheer, but I’ve heard first-hand that several VCs and hedge-fund types are starting to use these disruption models now. It seems to be coming up a lot all of a sudden, at least here in the Bay. I think some VCs are scared because the models might be better at picking startups than they are. It’s just going to be survival of the fittest, though, if those guys don’t get the same returns as investors using disruption modeling. My colleague worked with Growth Science and said it was tremendously eye-opening (in a good way), but it made him realize he’d bet on a couple of the wrong portcos.

    A lot of VCs won’t admit that and will ride a dead horse right off a cliff. Same with a lot of startups; they’ll stick with a dead strategy no matter what. I see this in HPC all the time. It takes a lot of guts to change course when you’ve committed a lot of resources in the wrong direction. I think anyone who ignores disruptive modeling does so at their own peril.

  5. Virgilia Singh says

    I have had the pleasure of working directly with Thomas and must say that all of the previous comments don’t do justice to the innovative way he approaches each project. For venture capitalists to truly succeed in this 2.0 era, they must acknowledge that disruption exists and tackle it head-on.

  6. Mike Redmen says