Alex Bouzari Presents: DDN Vision

“By developing technology that solves the 21st-century challenges of massive data creation and complex information analytics (what many are calling the ‘Big Data Era’), and by leveraging an extensive network of go-to-market partners that includes IBM, HP, Dell, and Sony, DDN has successfully deployed thousands of systems in enterprises, universities and government agencies worldwide.”

Site-wide Storage Use Case and Early User Experience with Infinite Memory Engine

“IME unleashes a new I/O provisioning paradigm. This breakthrough software-defined storage application introduces a whole new tier of transparent, extendable, non-volatile memory (NVM) that provides game-changing latency reduction and greater bandwidth and IOPS performance for the next generation of performance-hungry scientific, analytic and big data applications – all while offering significantly greater economic and operational efficiency than today’s traditional disk-based and all-flash array storage approaches that are currently used to scale performance.”

Video: Industrial Supercomputing, Why Do We Care?

Merle Giles

“NCSA has worked with more than one-third of the Fortune 50, in sectors including manufacturing, oil and gas, finance, retail/wholesale, bio/medical, life sciences, astronomy, agriculture, technology, and more. NCSA’s Private Sector Program currently boasts 26 partners. PSP’s core mission is to help its partner community gain a competitive edge through expert use of modern, high-performance digital and human resources.”

Satoshi Matsuoka Presents: A Look at Big Data in HPC

In this video from the DDN User Group at ISC’14, Satoshi Matsuoka from the Tokyo Institute of Technology presents: A Look at Big Data in HPC. “HPC has been dealing with big data for all of its existence. But it turns out that the recent commercial emphasis on big data has coincided with a fundamental change in the sciences as well. As scientific instruments and facilities produce large amounts of data at an unprecedented rate, the HPC community is reacting by revisiting architecture, tools, and services to address this growth in data.”