IBM has further strengthened its portfolio of cloud solutions, following its announcement last week of a new pricing model for its high-performance cloud storage systems.
In this whitepaper from Adaptive Computing, we learn about the new concept of Big Workflow and how it addresses the needs of critical, data-intensive applications. By adding intelligence around data control, Big Workflow enables big data, HPC, and cloud environments to interoperate dynamically, based on which applications are running.
In this slidecast, Rob Clyde from Adaptive Computing describes Big Workflow, the convergence of Cloud, Big Data, and HPC in enterprise computing. “The explosion of big data, coupled with the collision of HPC and cloud, is driving the evolution of big data analytics,” said Rob Clyde, CEO of Adaptive Computing. “A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately, and cost-effectively, but also provides a distinct competitive advantage.”
Over at the XSEDE blog, Scott Gibson writes that the organization is collaborating with industrial partners to advance open science and improve companies’ bottom lines alike. The “Industry Challenge” is a new XSEDE program designed to bring the scientific and industrial communities together in multidisciplinary collaborative teams and connect them with world-class advanced digital services. […]