HDR 200Gb/s InfiniBand: The Key to Success for Supercomputers Around the World


In recent years, High Performance Computing (HPC) has accomplished significant achievements that were once thought impossible in a variety of scientific and medical fields. Despite the growing list of breakthroughs these HPC systems have enabled, performance and scalability remain top of mind when it comes to unlocking their full potential.

Remote Direct Memory Access (RDMA) fabrics, such as InfiniBand and RDMA over Converged Ethernet (RoCE), are critical to overcoming the scalability and performance challenges of the most data-intensive workloads being developed and deployed today. As a mature and field-proven technology, InfiniBand is used in thousands of data centers, high-performance compute clusters and embedded applications that scale from two nodes up to clusters of thousands of nodes.

The InfiniBand Trade Association (IBTA) is chartered with maintaining and furthering the industry-standard InfiniBand architecture specification. The IBTA is led by a distinguished steering committee composed of companies such as Broadcom, HPE, IBM, Intel, Marvell, Microsoft and NVIDIA. The members represent leading enterprise IT vendors that are actively contributing to the advancement of the InfiniBand specification, which adapts to meet industry demands and sets goals for future speed increases. InfiniBand is extremely flexible and enables innovative system designs, allowing the development of new topologies that can optimize enterprise data centers and HPC systems based on application-specific workloads and communication patterns. This not only maximizes application and overall system performance, but also delivers a clear return on investment.

The InfiniBand Roadmap provides clear timelines and expectations for performance improvements. With HDR 200Gb/s InfiniBand shipping today, the current roadmap shows a projected demand for increasingly higher bandwidth, with new NDR 1.2Tb/s InfiniBand products planned for 2020. The IBTA also maintains interoperability guidelines to guarantee forward and backward compatibility across generations. Unlike with proprietary network technologies, this ensures that previous investments made in HPC systems will be protected.
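The roadmap speeds scale with link width: an InfiniBand link aggregates 1, 4 or 12 lanes, so HDR's 50Gb/s per-lane signaling yields 200Gb/s over the common 4x link, while NDR's 100Gb/s lanes reach 1.2Tb/s over a 12x link. A minimal sketch of that arithmetic, with per-lane rates taken as assumptions from public IBTA roadmap figures:

```python
# Per-lane signaling rates in Gb/s (assumed from public IBTA roadmap figures)
LANE_RATES = {"EDR": 25, "HDR": 50, "NDR": 100}

def link_bandwidth(generation: str, width: int) -> int:
    """Aggregate link bandwidth in Gb/s for a 1x, 4x, or 12x link."""
    return LANE_RATES[generation] * width

print(link_bandwidth("HDR", 4))   # 4x HDR link: 200 Gb/s
print(link_bandwidth("NDR", 12))  # 12x NDR link: 1200 Gb/s, i.e. 1.2 Tb/s
```

The 4x width is the standard port configuration, which is why generation names are usually quoted at their 4x rates (HDR = 200Gb/s).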

Today, InfiniBand accelerates the majority of the world’s most powerful supercomputers and is the de facto interconnect standard for HPC systems. From small- and large-scale scientific simulations to extremely data-driven deployments making critical real-time decisions, InfiniBand offers unmatched efficiency and reliability for compute-heavy applications such as artificial intelligence, machine learning and deep learning. InfiniBand allows users to extend their systems while leveraging existing investments in infrastructure and software. As a result, InfiniBand maintains ongoing leadership in networking and performance scalability over proprietary technologies for compute-intensive applications. InfiniBand was already an industry-leading interconnect, and now with the IBTA’s latest speed, HDR 200Gb/s InfiniBand, it remains the fastest end-to-end interconnect technology currently on the market.

According to the latest TOP500 List, HDR 200Gb/s InfiniBand accelerates 31 percent of new InfiniBand-based systems and powers the fastest supercomputer built in 2019. In addition to accelerating the world’s top systems, HDR 200Gb/s InfiniBand is utilized globally by universities, meteorologists, smart city developers and datacenter architects.

The widespread adoption of HDR 200Gb/s InfiniBand highlights the industry’s need for top tier performance in today’s most complex systems. The following use cases illustrate the scope of systems leveraging HDR 200Gb/s InfiniBand to further essential research and technological development.

  • Microsoft’s Azure HBv2-series Virtual Machines (VMs) make Azure the first public cloud to feature HDR 200Gb/s InfiniBand. HBv2 VMs support a variety of HPC workloads, such as weather prediction and computational fluid dynamics.
  • The most powerful supercomputer in Norway, the BullSequana XH2000, is accelerated by HDR 200Gb/s InfiniBand to enable research initiatives ranging from modeling climate change and discovering new cancer-fighting drugs, to gaining a better understanding of the origins of the universe.
  • The Center for Development of Advanced Computing (C-DAC) chose HDR 200Gb/s InfiniBand for India’s national supercomputing initiative. InfiniBand’s high data throughput, low latency, and scalability will support India’s digital transformation mission focused on advancing research, technology, and product development capabilities.
  • Finland’s CSC – IT Center for Science Ltd., will deploy HDR 200Gb/s InfiniBand to enable Finnish researchers in universities and research institutes to investigate climate science, renewable energy, astrophysics, nanomaterials, and bioscience, among a wide range of exploration activities.
  • “Gadi,” a supercomputer that will support some of Australia’s most critical scientific research, is leveraging HDR 200Gb/s InfiniBand to deliver high performance and fast data speeds. HDR 200Gb/s InfiniBand enables deep learning platforms’ data aggregation operations to be offloaded to and accelerated by the InfiniBand network, improving performance by up to two times.
  • The University of Tsukuba’s Center for Computational Sciences (CCS) has installed “Cygnus,” an 80-node cluster. HDR 200Gb/s InfiniBand allows Cygnus to perform accelerated research in the areas of astrophysics, particle physics, material science, AI and meteorology.
  • HDR 200Gb/s InfiniBand was selected by the European Centre for Medium-Range Weather Forecasts (ECMWF) to power their new supercomputer. This HPC system will be one of the world’s most powerful meteorological supercomputers, supporting weather forecasting and prediction researchers from over 30 countries across Europe. HDR 200Gb/s InfiniBand enables running nearly two times higher-resolution probabilistic weather forecasts in under an hour, which improves the ability to monitor and predict progressively severe weather phenomena and empowers European countries to establish safety measures to protect their citizens and property.
  • The University of Michigan is leveraging HDR 200Gb/s InfiniBand for projects within “MCity,” a multidisciplinary program aiming to create smarter cities, with an emphasis on the role of intelligent, connected vehicles. With HDR 200Gb/s InfiniBand, any project that spans multiple nodes benefits from the fastest available data transfer speeds and enhanced performance for each application.
  • OpenStack software now features native and upstream support for virtualization over an HDR 200Gb/s InfiniBand network. This addition allows customers to construct high-performance OpenStack-based cloud services over the leading interconnect infrastructure.
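For deployments like the OpenStack case above, upstream Neutron exposes SR-IOV networking that can ride on an InfiniBand fabric. The fragment below is an illustrative sketch only, not the IBTA's or OpenStack's definitive configuration: the section and option names follow the upstream Neutron ML2 SR-IOV documentation, while the physical network and interface names (`physnet_ib`, `ib0`) are hypothetical placeholders.

```ini
; /etc/neutron/plugins/ml2/ml2_conf.ini -- illustrative sketch only
; physnet_ib and ib0 are hypothetical names for this example
[ml2]
mechanism_drivers = openvswitch,sriovnicswitch

[sriov_nic]
; Map a Neutron physical network to the host's IB-capable device
physical_device_mappings = physnet_ib:ib0
```

In practice the compute nodes also need the corresponding PCI passthrough whitelist in Nova so that virtual functions on the InfiniBand adapter can be scheduled to guests.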

HDR 200Gb/s InfiniBand delivers the interconnect industry’s highest data throughput, extremely low latency and world-leading performance to HPC systems across the globe. Built on a foundation of HDR 200Gb/s InfiniBand, high-performance systems are changing the way we understand the world we live in through scientific discovery, environmental research and advanced medical research, and are realizing the potential for innovation in countless areas of business that are sure to drive change within the social and global landscape of tomorrow.
