HPC Technologies to Fuel Big Money for Big Data

Over at the HPC Source online magazine, Rob Farber writes that emerging HPC technologies like the Hybrid Memory Cube could drastically change how we approach Big Data analytics.

Big Data Predictions for 2014

“In 2014, HPC technology will combine original, un-fragmented data loads with visualization tools to map data—accelerating the creation of relational hypotheses. We will also start to see the first signs of machine learning for enterprise data insights as automated relational analysis enters the Big Data analytics space.”

New Moab Task Manager & Support for Intel HPC Distribution for Hadoop

Adaptive Computing has introduced Moab Task Manager, a localized decision-making tool within Moab’s HPC Suite that enables high-speed throughput on short computing jobs. Adaptive has also announced a partnership with Intel to integrate its Moab/TORQUE workload management software with the Intel HPC Distribution for Apache Hadoop software, which combines the Intel Distribution for Apache Hadoop software with the Intel Enterprise Edition of Lustre software.

Scalable Informatics Pushes the Limits of IO for Big Data at SC13

In this video from SC13, Joe Landman from Scalable Informatics describes the company’s array of innovative storage devices for high performance data analysis.

Panel Discussion: Solving the HPC Data Deluge

As science drives a rapidly growing need for storage, existing environments face increasing pressure to expand capabilities while controlling costs. Many researchers, scientists, and engineers find that they are outgrowing their current systems, but fear their organizations may be too small to cover the cost and support needed for more storage. Join these experts for a lively discussion on how you can take control and solve the HPC data deluge.

Creating a Better Infrastructure to Manage Big Data

Over the course of this talk, Trev Harmon looks back to the utility computing vision of Douglas Parkhill and proposes an application-centric workflow for the future that fulfills that vision across many disciplines of computing.

Using PBS to Schedule MapReduce Jobs Accessing OrangeFS

Using PBS Professional and a customized version of myHadoop, researchers at Clemson University can submit their own Hadoop MapReduce jobs on the “Palmetto Cluster,” running their own dedicated Hadoop daemons in a PBS-scheduled environment as needed.
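
To make the workflow concrete, here is a minimal sketch of what such a submission could look like: a Python wrapper that pipes a PBS job script to qsub, standing up per-job Hadoop daemons in the myHadoop style. The resource requests, helper script names (myhadoop-configure.sh, myhadoop-cleanup.sh), jar name, and paths are illustrative assumptions, not Clemson’s actual Palmetto configuration.

#!/usr/bin/env python3
"""Hypothetical sketch: submitting a Hadoop MapReduce job through PBS.

All paths, resource requests, and the myHadoop-style helper scripts
below are illustrative assumptions, not Clemson's actual setup.
"""
import subprocess

# The PBS job script: request a block of nodes, stand up per-job Hadoop
# daemons on them, run the MapReduce job, then tear everything down so
# the nodes return to the scheduler clean.
JOB_SCRIPT = """#!/bin/bash
#PBS -N hadoop-wordcount
#PBS -l select=4:ncpus=8:mem=16gb
#PBS -l walltime=01:00:00

# myHadoop-style setup (assumed flags): build a per-job Hadoop
# configuration from the node list PBS provides in $PBS_NODEFILE.
myhadoop-configure.sh -n 4 -c $HOME/hadoop-conf.$PBS_JOBID

# Start the job-private Hadoop daemons and run the MapReduce job.
$HADOOP_HOME/bin/start-all.sh
$HADOOP_HOME/bin/hadoop jar $HADOOP_HOME/hadoop-examples.jar \\
    wordcount /user/$USER/input /user/$USER/output

# Shut the daemons down and clean up the per-job configuration.
$HADOOP_HOME/bin/stop-all.sh
myhadoop-cleanup.sh
"""

# qsub accepts a job script on stdin and prints the new job ID.
result = subprocess.run(["qsub"], input=JOB_SCRIPT,
                        capture_output=True, text=True, check=True)
print("Submitted job:", result.stdout.strip())

The key design point is that the Hadoop daemons live only for the lifetime of the PBS job, so the cluster’s ordinary scheduling policies still govern the nodes.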

Sponsored Post: Balancing High Performance with Cost when Analyzing Big Data

HPC and big data professionals need high performance to ingest and analyze huge amounts of data, while still managing power and cost efficiently.

Slidecast: SSRLabs Develops Energy- and Instruction-Efficient HPC

Scalable Systems Research Labs is developing energy- and instruction-efficient coprocessors that tackle the “Big Data” problem by accelerating application execution.

IDC's Perspective on Big Data Outside of HPC

In this video from the IDC Breakfast Briefing at ISC’13, Steve Conway presents: IDC’s Perspective on Big Data Outside of HPC. View the slides or check out more talks from the show at our ISC’13 Video Gallery.