Over at the HPC Source online magazine, Rob Farber writes that pending HPC technologies like the Hybrid Memory Cube could drastically change how we approach Big Data analytics.
“In 2014, HPC technology will combine original, un-fragmented data loads with visualization tools to map data—accelerating the creation of relational hypotheses. We will also start to see the first signs of machine learning for enterprise data insights as automated relational analysis enters the Big Data analytics space.”
Innovations from Adaptive Computing include Moab Task Manager, a localized decision-making tool within Moab’s HPC Suite that enables high-speed throughput on short computing jobs. Adaptive has also announced a partnership with Intel to integrate its Moab/TORQUE workload management software with the Intel HPC Distribution for Apache Hadoop software, which combines the Intel Distribution for Apache Hadoop software with the Intel Enterprise Edition of Lustre software.
As science drives a rapidly growing need for storage, existing environments face increasing pressure to expand capabilities while controlling costs. Many researchers, scientists, and engineers find that they are outgrowing their current systems but fear their organizations may be too small to cover the cost and support needed for more storage. Join these experts for a lively discussion on how you can take control and solve the HPC data deluge.
In this video from the IDC Breakfast Briefing at ISC’13, Steve Conway presents: IDC’s Perspective on Big Data Outside of HPC. View the slides or check out more talks from the show at our ISC’13 Video Gallery.