Douglas Eadline writes that recent big investments in AI technology by IBM and Google show that intelligent systems are the future of big business. The problem is, these advancements could come at the expense of our privacy.
“For environments where large memory systems are critical – bioinformatics and legacy databases, i.e. Big Data – we have focused on a lot of performance enhancements. We strive to make large memory systems as fast as possible. It is interesting to note that in some cases, our VMs are faster than physical machines. We do this by prefetching and caching data based on our understanding of memory placement and access patterns.”
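The quote describes prefetching and caching driven by observed access patterns. As a minimal sketch of that general idea – not the vendor's actual VM-level implementation, which is not public – the toy cache below detects sequential block reads and speculatively loads the next few blocks. All names (`PrefetchingCache`, `prefetch_depth`, the dict-based backing store) are hypothetical.

```python
from collections import OrderedDict

class PrefetchingCache:
    """Toy block cache: detects sequential access and prefetches ahead.

    Hypothetical illustration only; real memory-placement-aware caching
    in a VM operates at a much lower level than this sketch.
    """

    def __init__(self, backing_store, capacity=8, prefetch_depth=2):
        self.backing_store = backing_store   # maps block id -> data
        self.capacity = capacity
        self.prefetch_depth = prefetch_depth
        self.cache = OrderedDict()           # LRU order: oldest first
        self.last_block = None

    def _load(self, block_id):
        # Pull a block into the cache, evicting the least recently used.
        if block_id not in self.cache and block_id in self.backing_store:
            self.cache[block_id] = self.backing_store[block_id]
            while len(self.cache) > self.capacity:
                self.cache.popitem(last=False)

    def read(self, block_id):
        hit = block_id in self.cache
        if hit:
            self.cache.move_to_end(block_id)  # refresh LRU position
        else:
            self._load(block_id)
        # Simple pattern detection: a sequential read triggers prefetch.
        if self.last_block is not None and block_id == self.last_block + 1:
            for ahead in range(1, self.prefetch_depth + 1):
                self._load(block_id + ahead)
        self.last_block = block_id
        return self.cache.get(block_id), hit
```

With a sequential workload, the second read primes the cache, so the third read is served without touching the backing store – the same effect the quote attributes to pattern-aware prefetching.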
“The INCITE and PRACE programs give access to increasingly powerful resources, allowing these technologies to be applied to industrial-scale systems. Drawing on past and ongoing research at CERFACS, this presentation highlights the scientific breakthroughs enabled by HPC on exascale machines for reacting flows in gas turbines and explosions in buildings.”
“Today, most financial services organizations try to solve all of their big data challenges using either grid or cluster technologies. One popular approach involves the use of Hadoop running on commodity x86-based clusters. At Cray we’re taking a different approach, leveraging our supercomputing technologies to improve I/O performance, disk utilization and efficiency.”
In this slidecast, Rob Clyde from Adaptive Computing describes Big Workflow — the convergence of Cloud, Big Data, and HPC in enterprise computing. “The explosion of big data, coupled with the collision of HPC and cloud, is driving the evolution of big data analytics,” said Rob Clyde, CEO of Adaptive Computing. “A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately and cost effectively, but also provides a distinct competitive advantage.”
Ralph H. Castain from Intel presented this talk at the Adaptive Computing booth at SC13. “The solution allows customers to leverage both their HPC and big data investments in a single platform, as opposed to operating them in siloed environments. The convergence between big data and HPC environments will only grow stronger as organizations demand data processing models capable of extracting the results required to make data-driven decisions.”
Over at the XSEDE blog, Scott Gibson writes that the organization is collaborating with industrial partners to both advance open science and improve companies’ bottom lines. The “Industry Challenge” is a new XSEDE program designed to bring the scientific and industrial communities together in multidisciplinary collaborative teams and connect them with world-class advanced digital services. […]
In this video from the TCC Conference, Rich Brueckner from insideHPC describes the convergence of Big Data and HPC. “While the term Big Data has become pervasive in Information Technology, many in the industry are still puzzled by how to make money from this phenomenon. In this talk, Brueckner will look at what’s really behind Big Data as an engine for change with case studies that are bringing the full potential of Big Data home.”