The Scary Side of AI and Big Data


Douglas Eadline writes that recent big investments in AI technology by IBM and Google show that intelligent systems are the future of big business. The problem is, these advancements could come at the expense of our privacy.

For Financial Services, it’s HPC to the Rescue

Austin Trippensee

“Today, most financial services organizations try to solve all of their big data challenges using either grid or cluster technologies. One popular approach involves the use of Hadoop running on commodity x86-based clusters. At Cray we’re taking a different approach, leveraging our supercomputing technologies to improve I/O performance, disk utilization and efficiency.”

Slidecast: How Big Workflow Delivers Business Intelligence


In this slidecast, Rob Clyde from Adaptive Computing describes Big Workflow — the convergence of Cloud, Big Data, and HPC in enterprise computing. “The explosion of big data, coupled with the collisions of HPC and cloud, is driving the evolution of big data analytics,” said Rob Clyde, CEO of Adaptive Computing. “A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately and cost effectively, but also provides a distinct competitive advantage.”

Porting Hadoop to HPC


Ralph H. Castain from Intel presented this talk at the Adaptive Computing booth at SC13. “The solution allows customers to leverage both their HPC and big data investments in a single platform, as opposed to operating them in siloed environments. The convergence between big data and HPC environments will only grow stronger as organizations demand data processing models capable of extracting the results required to make data-driven decisions.”

Rich Brueckner Presents: Big Data – What’s It Really About?


In this video from the TCC Conference, Rich Brueckner from insideHPC describes the convergence of Big Data and HPC. “While the term Big Data has become pervasive in Information Technology, many in the industry are still puzzled by how to make money from this phenomenon. In this talk, Brueckner will look at what’s really behind Big Data as an engine for change with case studies that are bringing the full potential of Big Data home.”

The Time is Now for In-Memory Analytics


“In-memory database and analytics solutions enable significant performance gains in analyzing complex and diverse datasets. We’re talking about analysis in seconds or minutes rather than hours or days. This is how you get to real-time insight.”

Adaptive Computing Top Predictions for Big Data Computing


“The speed, accuracy and cost at which enterprises can process big data analytics is the new competitive battleground, and we expect the need for results to greatly impact computing in 2014 and beyond,” said Rob Clyde, CEO of Adaptive Computing. “In our estimation, big data requires a streamlined approach to a complex data analysis and simulation process that can manage all resources across multiple computing platforms.”

Programming GPUs Directly from Python Using NumbaPro


“NumbaPro is a powerful compiler that takes high-level Python code directly to the GPU, producing fast code equivalent to programming in a lower-level language. It contains an implementation of CUDA Python as well as higher-level constructs that make it easy to map array-oriented code to the parallel architecture of the GPU.”

SGI to Develop SAP HANA In-Memory Computing System


Today SGI announced plans to develop an in-memory appliance based on the SAP HANA platform. According to SGI, the company’s UV in-memory computing system based on SAP HANA will be scalable to manage the growing computing needs associated with enterprise big data.

IBM Billion-dollar Division to Deliver Watson Through the Cloud


“Watson Discovery Advisor uses Watson’s cognitive intelligence to save researchers time, after reading through, determining context and synthesizing vast amounts of data. It helps users pinpoint connections within the data to accelerate their work. Connections are made that are often overlooked within the enormous volume of information available from relevant data sources.”