In this Intel Chip Chat podcast, Bill Mannel from HP stops by to discuss the growing demand for high performance computing solutions and the innovative use of HPC to manage big data. He highlights an alliance between Intel and HP that will accelerate HPC and big data solutions tailored to meet the latest needs and workloads of HPC customers, leading with customized vertical solutions.
“These new flash products greatly expand the range of our total product portfolio and demonstrate how Seagate’s acquisition of the LSI flash technologies is paying off. The Nytro XF1440/XM1440 SSDs deliver the highest performance in the smallest power envelope. The XP6500 flash accelerator card provides ultra-low latency capability for applications that require fast logging and produce significantly higher transactions per second, something today’s applications demand.”
“IBM Platform Data Manager for LSF takes control of data transfers to help organizations improve data throughput and lower costs by minimizing wasted compute cycles and conserving disk space. Platform Data Manager automates the transfer of data used by application workloads running on IBM Platform LSF clusters and the cloud, bringing frequently used data closer to compute resources by storing it in a smart, managed cache that can be shared among users and workloads.”
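The smart, managed cache described above can be illustrated with a minimal sketch. This is not IBM Platform Data Manager's actual API; the `ManagedDataCache` class and its methods are hypothetical, and the example only shows the general pattern of staging each input file once so that multiple workloads share a single transfer.

```python
import shutil
import tempfile
from pathlib import Path

class ManagedDataCache:
    """Toy shared cache: stage each source file once, reuse across workloads."""

    def __init__(self, cache_dir):
        self.cache_dir = Path(cache_dir)
        self.cache_dir.mkdir(parents=True, exist_ok=True)
        self.transfers = 0  # count actual copies, to show how sharing saves transfers

    def stage(self, source):
        """Return a local path for `source`, copying only on the first request."""
        source = Path(source)
        local = self.cache_dir / source.name
        if not local.exists():
            shutil.copy2(source, local)  # stands in for a remote-to-local transfer
            self.transfers += 1
        return local

# Usage: two workloads requesting the same input trigger only one transfer.
with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "input.dat"
    src.write_text("simulation input")
    cache = ManagedDataCache(Path(tmp) / "cache")
    a = cache.stage(src)  # first workload: copy happens
    b = cache.stage(src)  # second workload: cache hit, no copy
```

Two workloads asking for the same input result in a single copy into the cache, which is the compute-cycle and disk-space saving the announcement describes.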
In this video, the Radio Free HPC team looks at the newly announced 3D XPoint technology from Intel and Micron. “3D XPoint ushers in a new class of non-volatile memory that significantly reduces latencies, allowing much more data to be stored close to the processor and accessed at speeds previously impossible for non-volatile storage.”
“As data explodes in volume, velocity and variety, and the processing requirements to address business challenges become more sophisticated, the line between traditional and high performance computing is blurring,” said Bill Mannel, vice president and general manager, HPC and Big Data, HP Servers. “With this alliance, we are giving customers access to the technologies and solutions as well as the intellectual property, portfolio services and engineering support needed to evolve their compute infrastructure to capitalize on a data-driven environment.”
In this podcast, the Radio Free HPC team looks at how the KatRisk startup is using GPUs on the Titan supercomputer to calculate global flood maps. “KatRisk develops event-based probabilistic models to quantify portfolio aggregate losses and exceedance probability curves. Their goal is to develop models that fully correlate all sources of flood loss, including explicit consideration of tropical cyclone rainfall and storm surge.”
“For more than a decade, DDN’s innovative, proven technology has been at the forefront of solving big data storage challenges in the world’s largest and most demanding environments,” said Molly Rector, CMO and EVP of Product Management at DDN. “Today, our vision and roadmap are being increasingly influenced by our enterprise and commercial HPC customers, who are demanding new features in addition to the highest availability, reliability, scalability and best price/performance.”