What if there were a way to model the probability of your company’s success – a methodology that could predict success or failure with remarkable accuracy?
Within just the next few years, we’re likely to have a new standard that will be used by many investors to help them with due diligence before committing their money to new technology startups.
In this exclusive insideHPC interview, Mike Bernhardt talks with Thomas Thurston, president and managing director of Growth Science International, a prediction research organization that specializes in disruptive modeling using high accuracy tools to predict the likely success or failure of businesses.
insideHPC: What is “Disruptive Modeling,” and why should HPC companies care about it?
Thurston: Disruptive modeling is a surprising way to predict whether a business is likely to survive or fail. It grew out of research pioneered over a decade ago by Professor Clayton Christensen at Harvard. Building on this research a few years ago at Intel, and then at Harvard with Christensen, I had some new breakthroughs that have not yet been published. VCs, investment firms, large companies and others now work with my firm to guide their investments. Through these breakthroughs we’ve predicted whether businesses will live or die with more than 80% accuracy. It’s uncanny. It also turns out that HPC is one of the most enigmatic sectors when viewed through the models.
insideHPC: 80% accuracy is hard to believe. Venture capitalists tend to be right 10-20% of the time when they bet on a business. Could your results be luck?
Thurston: That’s the right question to ask, and it’s the first thing professors at MIT brought up at the beginning of my research back in 2006–2007. Since, on average, about 80–90% of all businesses fail, we would have been 80–90% accurate if we’d just predicted that all businesses will fail. Long story short, we used the right statistical rigor to make sure our results weren’t dumb luck. They were statistically significant with more than 99% confidence. Still, we keep challenging the models through some pretty rigorous scrutiny and so far, so good. No, this is not just luck.
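The base-rate point above is worth seeing in numbers. The sketch below uses entirely hypothetical figures (a sample of 50 businesses with an 85% failure rate, and a model correct on 49 of 50) to show why raw accuracy proves nothing on its own, and how an exact binomial test checks whether a model genuinely beats the base rate. This is not Growth Science’s actual methodology or data, just a minimal illustration of the statistical reasoning:

```python
from math import comb

def binom_tail(n, k, p):
    """Exact upper-tail probability P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical sample: 50 startups, with an 85% base failure rate.
n, base_rate = 50, 0.85

# A trivial predictor that says "everything fails" is right ~85% of the
# time, so 80%+ raw accuracy alone is meaningless.
trivial_accuracy = base_rate

# Suppose a model calls 49 of the 50 outcomes correctly.  Under the null
# hypothesis that it is merely guessing at the base rate, how likely is
# a result at least that good?
p_value = binom_tail(n, 49, base_rate)

print(f"trivial 'all fail' accuracy: {trivial_accuracy:.0%}")
print(f"p-value of 49/50 under the base-rate null: {p_value:.4f}")
```

With these made-up numbers the p-value comes out well below 0.01, which is the kind of result that would support a "significant with more than 99% confidence" claim; a model scoring only at the base rate would not clear that bar no matter how high its raw accuracy looked.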
insideHPC: Can you illustrate how these models work, using examples from HPC?
Thurston: Sure. Every year it seems like a new crop of HPC startups promise higher performance than what’s already on the market. Almost by definition, HPC startups typically shoot for high(er) performance. Yet surprisingly, this combination of being a “new entrant” with a “better performance” solution creates an extremely low empirical probability of success for many companies.
Here’s an example that hits home with most folks in the HPC community. Think of SiCortex. It seems like just yesterday SiCortex was making a splash with its hardware and virtualization technology, not to mention the visual appeal of the boxes. When I first looked at it back in 2007 I have to say – I was impressed. Yet our models predicted that SiCortex would fail regardless, and that’s exactly what happened. From another perspective, think of all the new entrants who tried to beat industry storage leaders like EMC in the performance game. Most of those new entrants perished, with Entrada Networks being a good example from the 1990s. The models suggest that instead of shooting for higher performance, companies like SiCortex and Entrada Networks would have been far more likely to succeed if they’d chosen any one of three specific go-to-market strategies instead. It’s a shame the models weren’t more available then.
insideHPC: If HPC startups shouldn’t shoot for high performance, what should they do?
Thurston: That’s where the methodology and our analysis really add value. It’s going to be somewhat different for each company. But it turns out that HPC startups can pursue strategies other than just “more performance” with surprisingly higher empirical probabilities of success. This is what most people don’t realize. For example, while it’s not the only strategy that works, in prescribed circumstances the highest-probability approach is for a new company to start with the lowest-performing, lowest-cost solution in the market. In other words, even if HPC is your end goal, the fastest path can be to start at the low end, not the high end. This defies most thinking.
Back to storage: rather than offering better performance, NetApp entered the market with low-cost, low-performing NAS boxes and has done tremendously well. They came in at the low end and kept improving every year, taking market share from SANs one small bite at a time for over a decade. If, rather than starting at the low end, NetApp had taken on EMC at the high end on day one, EMC probably would have crushed them just like Entrada Networks. Yet instead, by some estimates NetApp’s revenue last year was roughly 20% of the size of the entire HPC industry. That’s the kind of difference the right models can make.
insideHPC: What if a new technology is radically better? At some point do the odds change?
Thurston: Don’t get me wrong, in certain circumstances higher performing technology can be quite likely to succeed. Industry incumbents successfully launch higher performing technology all the time. Disruption models are circumstance based, so in some circumstances – where you are a significant incumbent – better technology is often the right move.
However, the circumstance changes if a company is a startup or new entrant to an industry. As a new entrant, even if you’re 100x faster and 50% lower cost, you’re statistically doomed. Some people have a hard time accepting empirically based strategy because it can run against intuition. Intuition is good, but it’s risky to ignore data. It pays to know what has or hasn’t worked across large data sets before jumping into a market, especially HPC.
insideHPC: What about GPUs? Did the models predict the growing fervor around GPUs?
Thurston: At Intel years ago, we had some conversations with the group who founded Larrabee. The models predicted that GPUs would start disrupting CPUs, and it was a pleasant surprise to learn the Larrabee team had already considered this. While it’s hard to say exactly what prompted Larrabee, I think this notion of disruption most likely had a lot to do with it, whether or not that was the exact vocabulary they were using at the time. History will show if Larrabee is a success or not, but there’s no question that GPUs are moving up-market in a disruptive direction toward the heart of Intel’s enterprise platforms, not to mention low-end platforms too.
insideHPC: What do you mean? Could GPUs disrupt CPUs in the consumer segment as well?
Thurston: The threat is definitely there. ARM architectures have been on a steady path to disrupt x86 CPUs for over a decade. Starting at the low end with cell phones, ARM-based CPUs have somewhat quietly moved up-market into smartphones, and now netbooks. As they continue to improve in line with Moore’s Law, they are reaching 2 GHz while remaining far lower power and lower cost than even Atom. The tipping point is likely to begin in earnest during 2010.
Nvidia’s Ion platform foreshadows where GPUs may come into play at the low end. Last year Nvidia created a reference design that paired the GeForce 9400 with an Atom CPU. It was a 2-chip system since the northbridge and southbridge were combined into the GPU. In that design you don’t need a very sophisticated CPU to get some pretty compelling performance, since most of the processing is done by the GPU. If Atom plus a GPU can give great performance, how long do you think it will be before a much lower cost ARM core can be paired with an Nvidia GPU to do 95% of what the consumer market needs? That’s one path to a $100 notebook with days of battery life instead of hours. It will be interesting to see how Intel responds.
You have to remember that Intel may not be the presumptive ‘Big Dog’ in the battle with ARM. Roughly 20x more ARM chips shipped last year than IA. Intel’s R&D also has to compete with hundreds of ARM licensees, most of which are fabless, giving them cost advantages and flexibility.
insideHPC: Are you saying Intel is facing possible disruption at the low end, and the high end, at the same time?
Thurston: That’s exactly right. But that’s not to say Intel isn’t well aware of this and already developing strategies to deal with the situation. Intel has survived disruption before – like when AMD and Cyrix were attacking from below in the 1990s. By recognizing what was happening, with Christensen’s help I should add, Intel responded with the low-cost, low-performance Celeron. AMD was slowed down and Cyrix exited the market. It probably saved Intel for a generation. However, Intel can’t simply copy what they did with the Celeron in the face of today’s GPUs and ARM. I for one am certainly not counting Intel out. They are far more innovative than people often give them credit for. I’m just saying there will have to be a new playbook. It’s going to be a wild ride.
insideHPC: Can a company build an effective product strategy with a GPU-only system?
Thurston: Yes, but there’s a significant challenge. Most of the new-entrant GPU-based solution vendors I see popping up are aiming for higher performance than other GPU- or CPU-based systems. As we discussed earlier, this is especially the case in HPC where, by definition, people are going for the high end. The research suggests most of these new entrants are doomed unless they have a methodology to help them think through the right strategic questions. The implications are tremendous for the software community as well.
insideHPC: What advice do you have for anyone entering HPC?
Thurston: Obviously my research gives me a bias, but to steal a line from Michael Mauboussin, “the plural of anecdote is not evidence.” Experience and intuition are the main tools most of us use to develop HPC strategy. Yet, while experience and intuition are invaluable, they are not the same as data. It turns out that, at least empirically, the best decisions are often counterintuitive and defy conventional experience. Whether using disruptive models or other ones, HPC companies need to take market precedent to heart. It’s not enough just to say “the odds of success are poor… but not for me.” Disruptive modeling is not new – but we have made tremendous strides in this area in the past few years and have taken it where it had never gone before. Many companies – and investors – would be well advised to take advantage of this predictive analysis tool.
Thomas Thurston is President and Managing Director of Growth Science International, based in Beaverton, Oregon. Readers can contact the company at [email protected]