
The Time is Now for In-Memory Analytics


“In-memory database and analytics solutions enable significant performance gains in analyzing complex and diverse datasets. We’re talking about analysis in seconds or minutes rather than hours or days. This is how you get to real-time insight.”

Adaptive Computing Top Predictions for Big Data Computing


“The speed, accuracy and cost at which enterprises can process big data analytics is the new competitive battleground, and we expect the need for results to greatly impact computing in 2014 and beyond,” said Rob Clyde, CEO of Adaptive Computing. “In our estimation, big data requires a streamlined approach to a complex data analysis and simulation process that can manage all resources across multiple computing platforms.”

Programming GPUs Directly from Python Using NumbaPro


“NumbaPro is a powerful compiler that takes high-level Python code directly to the GPU, producing fast code equivalent to programming in a lower-level language. It contains an implementation of CUDA Python as well as higher-level constructs that make it easy to map array-oriented code to the parallel architecture of the GPU.”

SGI to Develop SAP HANA In-Memory Computing System


Today SGI announced plans to develop an in-memory appliance based on the SAP HANA platform. According to SGI, the company’s UV in-memory computing system based on SAP HANA will be scalable to manage the growing computing needs associated with enterprise big data.

IBM Billion-dollar Division to Deliver Watson Through the Cloud


“Watson Discovery Advisor uses Watson’s cognitive intelligence to save researchers time by reading through, determining the context of, and synthesizing vast amounts of data. It helps users pinpoint connections within the data to accelerate their work, surfacing connections that are often overlooked within the enormous volume of information available from relevant data sources.”

HPC Technologies to Fuel Big Money for Big Data


Over at the HPC Source online magazine, Rob Farber writes that pending HPC technologies like the Hybrid Memory Cube could drastically change how we approach Big Data analytics.

Big Data Predictions for 2014


“In 2014, HPC technology will combine original, un-fragmented data loads with visualization tools to map data—accelerating the creation of relational hypotheses. We will also start to see the first signs of machine learning for enterprise data insights as automated relational analysis enters the Big Data analytics space.”

New Moab Task Manager & Support for Intel HPC Distribution for Hadoop


Adaptive Computing's new offerings include Moab Task Manager, a localized decision-making tool within Moab's HPC Suite that enables high-speed throughput on short computing jobs. Adaptive has also announced a partnership with Intel to integrate Moab/TORQUE workload management software with the Intel HPC Distribution for Apache Hadoop software, which combines the Intel Distribution for Apache Hadoop software with the Intel Enterprise Edition of Lustre software.

Scalable Informatics Pushes the Limits of IO for Big Data at SC13


In this video from SC13, Joe Landman from Scalable Informatics describes the company’s array of innovative storage devices for high performance data analysis.

Panel Discussion: Solving the HPC Data Deluge


As science drives a rapidly growing need for storage, existing environments face increasing pressure to expand capabilities while controlling costs. Many researchers, scientists and engineers find that they are outgrowing their current system, but fear their organizations may be too small to cover the cost and support needed for more storage. Join these experts for a lively discussion on how you can take control and solve the HPC data deluge.