InsideHPC Guide to HPC Storage

HPC and technical computing environments require the collection, storage, and transmission of large-scale datasets. To meet these demands, datacenter architects must consider how increasing storage capacity over time will affect HPC workloads, performance, and system availability.

Storage for Achieving Performance at Scale

Storage and data management have arguably become the most important HPC “pain points,” with access density a particularly troubling issue. Many HPC sites are doubling their storage capacity every two to three years, but adding capacity alone does not address the access-density, data-movement, and related storage issues many HPC buyers face. When this happens, investments in processing, networking, middleware, and applications are choked off by bottlenecks in the storage infrastructure. If you’re looking to maximize the throughput of your technical computing infrastructure, storage performance often holds the key.
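To see why adding capacity alone doesn’t help, consider a minimal sketch of the arithmetic. This assumes the common definition of access density as IOPS per terabyte of usable capacity; the drive counts and IOPS figures below are hypothetical, chosen only to illustrate the trend.

```python
def access_density(total_iops: float, capacity_tb: float) -> float:
    """Access density: I/O operations per second per terabyte of capacity."""
    return total_iops / capacity_tb

# A hypothetical array: 100 drives at 150 IOPS each, 400 TB usable.
before = access_density(100 * 150, 400)   # 37.5 IOPS/TB

# Doubling capacity by swapping in denser drives keeps the drive
# count (and thus total IOPS) flat, so access density is halved.
after = access_density(100 * 150, 800)    # 18.75 IOPS/TB
```

The point of the sketch: capacity grows with areal density, but per-drive IOPS barely moves, so every capacity doubling that doesn’t also add spindles (or faster media) cuts the I/O available per terabyte roughly in half.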

Slidecast: Fujifilm’s New Dternity Storage Technology for NAS

“Data has become the foundation upon which all smart business decisions are made. Archives have traditionally been inaccessible, offline and practically invisible. The new business culture demands more. The Dternity S unlocks the potential of your data, providing easy online access to all of your archived content.”

Avere Delivers Storage Solutions for the HPC Cloud Era

In this video from SC13, Jeff Tabor from Avere Systems describes the company’s optimized NAS storage solutions including the new Cloud NAS. “Avere has always been on the forefront of pushing the boundaries of file storage performance and efficiency. Avere Cloud NAS is a natural evolution of the product, advancing the adoption of cloud storage for serious business use cases.”