Linux Foundation Announces OpenHPC Collaborative Project


Today the Linux Foundation announced plans to form the OpenHPC Collaborative Project. This project will provide a new, open source framework to support the world’s most sophisticated HPC environments.

“The use of open source software is central to HPC, but the lack of a unified community across key stakeholders – academic institutions, workload management companies, software vendors, computing leaders – has caused duplication of effort and increased the barrier to entry,” said Jim Zemlin, executive director, The Linux Foundation. “OpenHPC will provide a neutral forum to develop one open source framework that satisfies a diverse set of cluster environment use cases.”

The new initiative includes support from Allinea Software, Altair, ANSYS, Argonne, Atos, Barcelona Supercomputing Center, The Center for Research in Extreme Scale Technologies at Indiana University, Cray, Dell, Fujitsu Systems Europe, Hewlett Packard Enterprise, Intel, Jülich Supercomputing Centre, Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, Leibniz Supercomputing Center, Lenovo, Los Alamos National Laboratory, MSC Software, NEC, Pittsburgh Supercomputing Center, Sandia National Laboratories, SENAI CIMATEC, SUSE and Texas Advanced Computing Center.

For more than four decades, High Performance Computing has been used by universities and research centers for large-scale modeling and calculations required in meteorology, astronomy, engineering and nuclear physics, among others. With unique application demands and parallel runtime requirements, software remains one of the biggest challenges for HPC user adoption (See IDC Worldwide HPC Server 2015–2019 Forecast). Open source and Linux-based software components have become a standard way to reliably test and maintain stable operating conditions while providing a cost-effective means for scaling with data growth.

OpenHPC will provide a new, open source framework for HPC environments, consisting of upstream project components, tools and interconnections that enable the software stack. The community will deliver an integrated and validated collection of HPC components that together form a full-featured reference HPC software stack for developers, system administrators and users. As an open source and framework-agnostic stack, OpenHPC will offer the flexibility of multiple configurations and the scalability to meet a wide variety of user needs.

OpenHPC members plan to work together to:

  • Create a stable environment for testing and validation: The community will benefit from a shared, continuous integration environment, which will feature a build environment and source control; bug tracking; user and developer forums; collaboration tools; and a validation environment.
  • Reduce Costs: By providing an open source framework for HPC environments, the overall expense of implementing and operating HPC installations will be reduced.
  • Provide a robust and diverse open source software stack: OpenHPC members will work together on the stability of the software stack, allowing for ongoing testing and validation across a diverse range of use cases.
  • Develop a flexible framework for configuration: The OpenHPC stack will provide a group of stable and compatible software components that are continually tested for optimal performance. Developers and end users will be able to use any or all of these components depending on their performance needs, and may substitute their own preferred components to fit their own use cases.

“Cray has a long history of collaborating with other technology companies and our customers to deliver supercomputing breakthroughs, and we see the OpenHPC project as a great example of this type of collaborative innovation,” said Steve Scott, Cray SVP and CTO. “Cray plans on contributing some of our unique software to the OpenHPC software stack and using components of that stack in our systems to benefit our customers and their end-users.”
