Panasas Chief Scientist on Where HPC Meets Big Data and Hadoop



In this video, Panasas Chief Scientist Garth Gibson describes how the company brings Hadoop support to bear in the world of Big Data and HPC.

“Hadoop is a great platform for taking a gigantic amount of information and reducing it down to the central core that you then want to do the second level of analysis on. And that’s what’s happening across the enterprise, data warehousing, and HPC. So the fundamental issue is that after you’re done crunching with that commodity Hadoop cluster, you now have valuable assets. You want those valuable assets on a system you trust. You want them on a good, high-quality NAS. But it has to keep up; you need the high speed of a DirectFlow environment. And it turns out that once you can process from off-board storage quickly, you can optimize Hadoop and go faster in many cases, because you’re using your off-board NAS.”
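The workflow Gibson describes, a map step that scans a large input and a reduce step that collapses it to a small aggregate ready for second-level analysis, can be sketched in plain Python. This is an illustrative in-memory sketch of the MapReduce pattern, not Hadoop code; the function and variable names are hypothetical.

```python
# Minimal sketch of the map/reduce pattern described above: a large
# input is "reduced down" to a compact summary for further analysis.
# Pure Python stand-in; not a Hadoop or Panasas API.
from collections import defaultdict

def map_phase(records):
    # Map step: emit (key, value) pairs, here word counts from log lines.
    for line in records:
        for word in line.split():
            yield word, 1

def reduce_phase(pairs):
    # Reduce step: aggregate values by key, collapsing the data
    # to its "central core".
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

logs = ["error disk full", "warn retry", "error disk full"]
summary = reduce_phase(map_phase(logs))
print(summary)  # {'error': 2, 'disk': 2, 'full': 2, 'warn': 1, 'retry': 1}
```

In a real Hadoop deployment the map and reduce phases run in parallel across the cluster; the small `summary` output is the valuable asset the quote argues belongs on trusted, high-performance NAS storage.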

Recorded at SC12 in Salt Lake City. Read the Full Story.