Stop the Lab Worship: Commercial HPC is Sexy, Too


There’s a trend in the data storage world to call out a growing “divide” between HPC and enterprise storage. We believe that’s an oversimplification. What’s worse, it reinforces popular myths coursing through the HPC landscape that revel in what we call “lab worship” at the expense of more mainstream enterprise customers.

The biggest myth is that the “leadership class” systems at the national labs, the ones pursuing exascale capabilities, constitute the majority of the market. In fact, commercial HPC is the biggest segment. According to Intersect360, academic and government customers represent only 42 percent of the $39 billion HPC market. That leaves 58 percent for commercial HPC customers, including engineering, life sciences, manufacturing, energy, financial services, media, retail and transportation. And most of those academic and government customers are just trying to pursue their mission and get their research done, not expand the boundaries of computation.

The vast majority of HPC customers have terascale to low-petascale storage needs. They still need high performance, obviously, but they also need reliability and manageability out of the box. And they need to give users the ability to support mixed workloads efficiently, not manually. What they don’t need is do-it-yourself open-source toolkits that risk weeks of downtime. And they can’t just “absorb” the need for highly skilled administrators, downtime for fine tuning, and the other hidden costs of operation seemingly built into the infrastructure at the national labs, government agencies, and other subsidized entities.

What they need is a highly reliable, low maintenance system that moves lots of data quickly, and instantly adapts to diverse and changing file sizes and workflows. In other words, a sophisticated high-performance tool to get the job done, not a one-off piece of art.

The work that’s being done by commercial HPC customers is incredibly exciting and innovative. Dare we say sexy? For example, life science customers are using commercial HPC gear to discover new life-saving treatments, and energy companies are using commercial HPC to find new sources of power. Commercial HPC deployments increasingly feature machine learning and artificial intelligence techniques as enterprise customers seek to process, analyze and act on today’s rapid influx of new, large data sets.

We see two fundamental shifts happening across verticals, both driven by the need to support mixed workloads efficiently and without manual effort: the need to manage operational costs, not just acquisition costs, and the need to ensure researchers can do their work without downtime or distractions.

This is why Panasas has made a focused investment in advancing PanFS, our parallel file system, so we can deliver more power and more value to HPC customers. Rather than be the “limiting factor” for exascale-level HPC, Panasas believes file systems have great potential to be the accelerating factor for commercial HPC.

Commercial enterprises face massive opportunities for new discovery and innovation, and they are smartly leveraging HPC and AI to meet that challenge. The exciting, fast-growing market for commercial HPC products is playing a critical role in their success.