New APAX HDF Library Accelerates HDF-Enabled Applications 3-8X

This week Samplify released its new APAX HDF (Hierarchical Data Format) Storage Library for HPC, Big Data, and cloud computing applications.

HDF is an open source library and file format that serves as the standard for the interchange of climate data. In the current release, HDF5 version 1.8.11, the HDF Group adds support for third-party plug-ins in the storage pipeline. Samplify's APAX plug-in for HDF gives applications transparent access to APAX compression technology without requiring modification of the application software. When used in conjunction with Samplify's APAX Profiler, the encoding rate can be optimized for each dataset stored in the file.
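To illustrate the idea behind a storage-pipeline plug-in, here is a minimal sketch of a transparent compression filter: data is encoded on write and decoded on read without the application changing how it accesses its arrays. This is a simplified stand-in, not Samplify's actual plug-in or the APAX codec; zlib is used here only as a placeholder lossless encoder, and the function names are hypothetical.

```python
import zlib

# Hypothetical filter pair, as a storage pipeline would apply them.
# A real HDF5 filter plug-in registers equivalent encode/decode
# callbacks with the library; the application never calls them directly.

def encode_filter(chunk: bytes) -> bytes:
    """Applied transparently by the storage layer on write."""
    return zlib.compress(chunk)

def decode_filter(chunk: bytes) -> bytes:
    """Applied transparently by the storage layer on read."""
    return zlib.decompress(chunk)

# Simulated round trip through the pipeline.
original = b"climate-model output record " * 1000
stored = encode_filter(original)      # bytes that land on disk
restored = decode_filter(stored)      # bytes the application reads back

assert restored == original           # the round trip is lossless
print(len(original), "->", len(stored))
```

Because the filter sits inside the storage pipeline, the application sees identical data before and after, while the bytes on disk shrink; the same transparency is what lets HDF-enabled applications pick up APAX compression without code changes.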

“Our engagements with government labs, academic institutions, and private data centers reveal a continuous struggle to manage an ever-increasing amount of data,” says Al Wegener, Founder and CTO of Samplify. “We have been asked for a simpler way to integrate our APAX encoding technology in Big Data and cloud applications. By using plug-in technology for HDF, we enable any application that currently uses HDF as its storage format to get the benefits of improved disk throughput and reduced storage requirements afforded by APAX.”

At ISC’13 next week, Prof. Dr. Thomas Ludwig will describe experiences with the new Samplify technology in a session entitled “Combining HPC & Big Data in the Climate Modeling Workflow.”

Read the Full Story.
