Ralph H. Castain from Intel presented this talk at the Adaptive Computing booth at SC13. “The solution allows customers to leverage both their HPC and big data investments in a single platform, as opposed to operating them in siloed environments. The convergence between big data and HPC environments will only grow stronger as organizations demand data processing models capable of extracting the results required to make data-driven decisions.”
In this video from the TCC Conference, Rich Brueckner from insideHPC describes the convergence of Big Data and HPC. “While the term Big Data has become pervasive in Information Technology, many in the industry are still puzzled by how to make money from this phenomenon. In this talk, Brueckner will look at what’s really behind Big Data as an engine for change with case studies that are bringing the full potential of Big Data home.”
“The speed, accuracy and cost at which enterprises can process big data analytics is the new competitive battleground, and we expect the need for results to greatly impact computing in 2014 and beyond,” said Rob Clyde, CEO of Adaptive Computing. “In our estimation, big data requires a streamlined approach to a complex data analysis and simulation process that can manage all resources across multiple computing platforms.”
“NumbaPro is a powerful compiler that takes high-level Python code directly to the GPU, producing fast code that is the equivalent of programming in a lower-level language. It contains an implementation of CUDA Python as well as higher-level constructs that make it easy to map array-oriented code to the parallel architecture of the GPU.”
“Watson Discovery Advisor uses Watson’s cognitive intelligence to save researchers time by reading through, determining context for, and synthesizing vast amounts of data. It helps users pinpoint connections within the data to accelerate their work, surfacing connections that are often overlooked within the enormous volume of information available from relevant data sources.”
“In 2014, HPC technology will combine original, un-fragmented data loads with visualization tools to map data, accelerating the creation of relational hypotheses. We will also start to see the first signs of machine learning for enterprise data insights as automated relational analysis enters the Big Data analytics space.”
Innovations from Adaptive Computing include Moab Task Manager, a localized decision-making tool within Moab’s HPC Suite that enables high-speed throughput on short computing jobs. Adaptive has also announced a partnership with Intel to integrate Moab/TORQUE workload management software with the Intel HPC Distribution for Apache Hadoop software, which combines the Intel Distribution for Apache Hadoop software with the Intel Enterprise Edition of Lustre software.