The Ohio Supercomputer Center is seeking a Web & Interface Applications Group Manager in our Job of the Week.
In this whitepaper from Adaptive Computing, we learn about the new concept of Big Workflow and how it addresses the needs of critical, data-intensive applications. By adding intelligence to data control, Big Workflow provides a way for big data, HPC, and cloud environments to interoperate dynamically, based on which applications are running.
In this slidecast, Rob Clyde from Adaptive Computing describes Big Workflow, the convergence of Cloud, Big Data, and HPC in enterprise computing. “The explosion of big data, coupled with the collision of HPC and cloud, is driving the evolution of big data analytics,” said Clyde, CEO of Adaptive Computing. “A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately, and cost-effectively, but also provides a distinct competitive advantage.”
Over at the XSEDE blog, Scott Gibson writes that the organization is collaborating with industrial partners to both advance open science and improve companies’ bottom lines. The “Industry Challenge” is a new XSEDE program designed to bring the scientific and industrial communities together in multidisciplinary collaborative teams and connect them with world-class advanced digital services. […]
Burak Yenier presented this talk at the Stanford HPC Conference. “The UberCloud HPC Experiment brings together more than 1,000 participants to pave the way for HPC as a Service. Active project teams explore the end-to-end process of accessing remote computing resources in HPC centers and in HPC Clouds, to study and overcome the potential roadblocks for industry applications such as CAE, Bio, and Life Sciences.”