

Intel HPC Orchestrator Powers Research at University of Pisa

In this video, Maurizio Davini from the University of Pisa describes how the University works with Dell EMC and Intel to test new technologies and to integrate and optimize HPC systems with Intel HPC Orchestrator software.

“We believe these two companies are at the forefront of innovation in high performance computing,” said Maurizio Davini, Chief Technical Officer for the University of Pisa. “We also share a common goal of simplifying HPC to support a broader range of users.”

From quantum chemistry and nanophysics to genome sequencing, proteomics, and engineering simulations, researchers at the University of Pisa are using high performance computing across an increasing variety of research disciplines. HPC has become a foundational resource for more than 200 users who are working to push the boundaries of knowledge at the prestigious university. Demands for HPC are growing beyond traditional HPC disciplines as well, including areas such as big data analytics, data visualization, and machine learning.

This growing array of HPC demands is creating new challenges for the University of Pisa’s IT Center. Research teams use a variety of different applications, many of which have unique software dependencies. Supporting this growing and continuously changing software portfolio requires extensive integration and testing. To reduce the time and administrative overhead for this process, the IT Center is currently utilizing Intel HPC Orchestrator, which provides a pre-integrated HPC software stack designed to greatly simplify software installation and maintenance.

After working with Intel HPC Orchestrator in their test environment, the University of Pisa’s IT staff see great potential for this new approach. According to Davini, “The University of Pisa’s average HPC system software stack build and validation for a 32-compute-node system, with the base OS pre-installed, may take up to one or two days. The implementation of Intel HPC Orchestrator, with the base OS pre-installed, was accomplished in a few hours.”

The University of Pisa has five small to medium-sized clusters onsite, ranging from 10 to 60 nodes. They also have an on-premises private cloud based on the Dell Hybrid Cloud System for Microsoft. This integrated, Intel architecture-based cloud platform runs Windows* Azure Pack, which allows them to deploy Microsoft Azure* services on premises. Davini and his team have verified that they can use Intel HPC Orchestrator for both bare-metal and virtual deployment models, so they can use the same software tools whether they are providing dedicated physical infrastructure or a cloud-based cluster solution.

Research teams with less demanding performance requirements have been thrilled with the private cloud solutions. According to Davini, “We’re seeing a lot of new players in the HPC space that can’t justify the expense of a dedicated physical cluster. We recently had a team of acoustics researchers, for example, that needed a small cluster for just a few weeks. With Intel HPC Orchestrator and our private cloud, we can address these kinds of requests faster and with less effort.”

Intel HPC Orchestrator is based on the OpenHPC community system software stack, which is hosted by the Linux Foundation. OpenHPC includes a comprehensive set of commonly required HPC components, such as provisioning and resource management tools, I/O libraries, and numerous scientific libraries. Intel works with the open source community to deliver new functionality and to optimize code for high performance and reliability on Intel architecture. With Intel HPC Orchestrator, Intel adds value through additional testing, validation, and professional support, plus a number of proprietary software components that aid in system support and application development.

The University of Pisa is a Dell EMC | Intel HPC Innovation Center, available for technical collaboration and early access to technologies.
