Douglas Eadline writes that recent major investments in AI technology by IBM and Google show that intelligent systems are the future of big business. The problem is that these advances could come at the expense of our privacy.
“Today, most financial services organizations try to solve all of their big data challenges using either grid or cluster technologies. One popular approach involves the use of Hadoop running on commodity x86-based clusters. At Cray we’re taking a different approach, leveraging our supercomputing technologies to improve I/O performance, disk utilization and efficiency.”
In this slidecast, Rob Clyde from Adaptive Computing describes Big Workflow — the convergence of Cloud, Big Data, and HPC in enterprise computing. “The explosion of big data, coupled with the collision of HPC and cloud, is driving the evolution of big data analytics,” said Rob Clyde, CEO of Adaptive Computing. “A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately and cost effectively, but also provides a distinct competitive advantage.”
Ralph H. Castain from Intel presented this talk at the Adaptive Computing booth at SC13. “The solution allows customers to leverage both their HPC and big data investments in a single platform, as opposed to operating them in siloed environments. The convergence between big data and HPC environments will only grow stronger as organizations demand data processing models capable of extracting the results required to make data-driven decisions.”
In this video from the TCC Conference, Rich Brueckner from insideHPC describes the convergence of Big Data and HPC. “While the term Big Data has become pervasive in Information Technology, many in the industry are still puzzled by how to make money from this phenomenon. In this talk, Brueckner will look at what’s really behind Big Data as an engine for change with case studies that are bringing the full potential of Big Data home.”
“The speed, accuracy and cost at which enterprises can process big data analytics is the new competitive battleground, and we expect the need for results to greatly impact computing in 2014 and beyond,” said Rob Clyde, CEO of Adaptive Computing. “In our estimation, big data requires a streamlined approach to a complex data analysis and simulation process that can manage all resources across multiple computing platforms.”
“NumbaPro is a powerful compiler that takes high-level Python code directly to the GPU, producing fast code equivalent to programming in a lower-level language. It contains an implementation of CUDA Python as well as higher-level constructs that make it easy to map array-oriented code to the parallel architecture of the GPU.”
“Watson Discovery Advisor uses Watson’s cognitive intelligence to save researchers time by reading through, determining the context of, and synthesizing vast amounts of data. It helps users pinpoint connections within the data to accelerate their work, surfacing connections that are often overlooked within the enormous volume of information available from relevant data sources.”