The DDN User Group meeting will take place July 14 at ISC 2014 in Frankfurt.
ICHEC is using DDN’s Infinite Memory Engine (IME) to accelerate read and write speeds in tests of ICHEC’s codes used by Tullow Oil, Africa’s leading independent oil company. In this talk, Browne shows how IME accelerated current workflows and reduced run times, opening the door to significantly faster resource discovery.
“By developing technology which solves the 21st century challenges of massive data creation and complex information analytics — what many are calling the ‘Big Data Era’ — and leveraging an extensive network of go-to-market partners which includes IBM, HP, Dell, and Sony, DDN has successfully deployed thousands of systems in enterprises, universities and government agencies worldwide.”
“IME unleashes a new I/O provisioning paradigm. This breakthrough software-defined storage application introduces a whole new tier of transparent, extendable, non-volatile memory (NVM) that provides game-changing latency reduction and greater bandwidth and IOPS performance for the next generation of performance-hungry scientific, analytic and big data applications – all while offering significantly greater economic and operational efficiency than today’s traditional disk-based and all-flash array storage approaches that are currently used to scale performance.”
“NCSA has worked with more than one-third of the Fortune 50, in sectors including manufacturing, oil and gas, finance, retail/wholesale, bio/medical, life sciences, astronomy, agriculture, technology, and more. NCSA’s Private Sector Program currently boasts 26 partners. PSP’s core mission is to help its partner community gain a competitive edge through expert use of modern, high-performance digital and human resources.”
In this video from the DDN User Group at ISC’14, Satoshi Matsuoka from the Tokyo Institute of Technology presents: A Look at Big Data in HPC. “HPC has been dealing with big data for all of its existence. But it turns out that the recent commercial emphasis on big data has coincided with a fundamental change in the sciences as well. As scientific instruments and facilities produce large amounts of data at an unprecedented rate, the HPC community is reacting by revisiting architecture, tools, and services to address this growth in data.”