Interview: Addison Snell of Intersect360 Talks HPC, Cloud and Big Data

Intersect360 Research, a leading analyst firm for HPC, recently released its annual Budget Map report covering spending trends in supercomputing. We caught up with Addison Snell, CEO of Intersect360 Research, to learn what the latest edition revealed.

insideHPC: Addison, your publication recently released a report entitled, “Intersect360 Research HPC Budget Allocation Map: Industry Averages.” What was the nature of this piece?

Addison Snell: Our annual “Budget Map” report series looks at the relative spending between all of the products, components, and services that make up the HPC market. With six years of end user data, we get a strong grip on where the money is flowing, whether it’s on big items like clusters and storage, or on topical things like power consumption, programming, or compute cycles in public cloud. We also get a sense of future budget outlook and how the market is likely to evolve.

insideHPC: What were some of the major takeaways from the report?

Addison Snell: At the top level, hardware spending came down as a percent of total budgets, after a particularly high year last year. We see this as normal regression to the mean; after a few slow years during the tech recession, hardware spending had a catch-up year in 2012, then leveled off in 2013.

Another big takeaway is that commercial markets will continue to drive HPC growth for the next few years, as government and academic budgets worldwide continue to be constrained, though we’re seeing some signs of that loosening up.

Finally, a major area of focus is cloud computing. We did see a slight uptick in this year’s Budget Map, but it’s hard to call it a trend. Cloud is still a minor portion of budgets overall, and a lot of it goes to private clouds. We’re going to consider all our data as we revisit our cloud HPC forecast this quarter.

insideHPC: The HPC world is often seen as a niche market for the mad scientist. How can it go more mainstream, especially for the enterprise?

Addison Snell: HPC is already going mainstream for the enterprise, but it’s not always called HPC. Big Data is a major trend driving the need for scalable performance in many enterprises, large and small, worldwide, that would never consider themselves HPC users. Now they’re looking at larger clusters with more throughput, more in-house programming, and wall-time metrics. Not all Big Data is HPC, but there’s certainly significant overlap, and it’s a major growth driver.

On the more traditional HPC side, I think it’s a question of connecting HPC, which is a tool, to the value that the tool provides. No one is going to buy HPC just for the sake of having it. There has to be a business case for it. In manufacturing, for example, the number-one metric you can sell a new user on is product quality. If you can show how the use of HPC leads to a higher-quality product, there are big segments of users that will invest in it.

insideHPC: Big Data is certainly all the buzz in the technology world right now. How can data analytics benefit from the technology that comes from supercomputing?

Addison Snell: First of all, there are many different faces of Big Data. If we look at Big Data not as a single application but as an industry trend — the growth in the creation and accessibility of data creating organizational challenges in managing or gaining competitive advantage from that data — we see that the challenges are myriad. The challenge of ingesting a 100TB file is different from the challenge of managing 100 million small files, and the challenge of managing data that is useful only for fractions of a second is different from the challenge of managing data that must remain accessible for 100 years or more.

For any of these challenges, there are technologies that have existed in HPC for years that might have applicability. Building scalable I/O solutions can involve flash, surely, but also technologies like InfiniBand or parallel file systems. There is applicability in middleware, in shared memory, in programming models; all of it should be in consideration. But I stop short of saying that HPC has solved all these problems before. That would fail to appreciate that new applications, new problems, are everywhere. The question is how we can apply these technologies in ways that make sense. That’s the focus of our surveys on the Big Data opportunities for HPC.

insideHPC: How can cloud computing play a role in bringing HPC to a wider audience?

Addison Snell: One of the use cases for cloud computing in HPC is bringing in the true new user that doesn’t already have any HPC infrastructure. Cloud can let them try HPC — potentially very scalable HPC — at lower risk. The challenge is, these users also tend not to have HPC expertise, or the models that they would feed into these systems. Giving them access to HPC resources is necessary but not sufficient. There are a lot of other aspects to efficient HPC usage beyond, “Here’s a system you can rent.”

insideHPC: What is down the road for HPC? What will be the future applications from this field?

Addison Snell: That’s a fun question, with an answer that could go on for days without listing them all. That’s the great thing about this field. There’s no end to engineering achievement, no end to scientific discovery. In every field where HPC is deployed, researchers are chasing the horizon of what is possible — treating diseases, finding new energy sources, making minivans safer, finding other Earth-like planets, or adding shine and body to your hair. It doesn’t end, and there’s always innovation from top to bottom.

That said, if I have a personal favorite, it’s the work going on in brain research right now. Fifteen years ago, we mapped the human genome. That achievement didn’t end a field of research; it began one. We saw an explosion in what could be done through the expansive field of genomics. Similarly, I’m looking forward to the time, soon I think, when researchers will create a comprehensive map of the human brain. Imagine the understanding we could bring to topics like intelligence, memory, aging, and the neuropathology of disorders from autism to Alzheimer’s disease. When you think of the discoveries HPC can help enable, it’s truly amazing.