HPC Technologies to Fuel Big Money for Big Data

Over at the HPC Source online magazine, Rob Farber writes that pending HPC technologies like the Hybrid Memory Cube could drastically change how we approach Big Data analytics.

There is an adage in HPC that states, "real memory for real performance." Large memory machines bypass a host of I/O bottlenecks because they can load all, or at least a significant portion, of a big data set into RAM (random access memory). Such machines justify their cost because RAM is so much faster than most I/O subsystems. Currently, users can purchase servers that contain a terabyte of system RAM.

In the near future, hybrid memory cubes (HMCs) may make a terabyte of system RAM passé. With a purported ability to sustain a terabit/sec bandwidth (or an astounding 125 GB/s), hybrid memory cubes have the potential to greatly expand and accelerate computer memory subsystem performance. As I noted in my Scientific Computing article, "The GPU Performance Revolution," HMCs have the potential to revolutionize massively parallel co-processors like GPUs and MIC. Similarly, HMCs can drastically expand the capabilities of conventional workstations and servers.
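As a quick sanity check on the figures quoted above, a sketch of the unit conversion (assuming decimal units: 1 terabit = 10^12 bits, 1 GB = 10^9 bytes) shows that a terabit per second is indeed 125 GB/s, enough to fill a terabyte of RAM in about eight seconds:

```python
# Back-of-envelope check of the quoted HMC bandwidth figure.
# Assumes decimal (SI) units: 1 terabit = 1e12 bits, 1 GB = 1e9 bytes.

claimed_bits_per_sec = 1e12                    # purported HMC bandwidth: 1 terabit/s
bytes_per_sec = claimed_bits_per_sec / 8       # 8 bits per byte
gb_per_sec = bytes_per_sec / 1e9
print(f"1 Tbit/s = {gb_per_sec:.0f} GB/s")     # → 125 GB/s

# Time to stream a 1 TB data set into RAM at that rate
terabyte_bytes = 1e12
seconds = terabyte_bytes / bytes_per_sec
print(f"1 TB loads in {seconds:.0f} s")        # → 8 s
```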

Read the Full Story (flash-based) or Download the PDF.