Search Results for: mellanox

Virtualized High Performance Computing with Mellanox FDR and RoCE

Josh Simons

“The HPC community can realize significant benefits from adopting enterprise-capable IT solutions grounded in proven virtualization and cloud technology. And conversely, as business IT environments become increasingly compute-intensive, lessons learned by the scientists and engineers working with HPC can be transferred to their counterparts in the enterprise. It’s a win-win situation.”

Mellanox and ProLabs to Collaborate


Today Mellanox announced that ProLabs will supply the company’s 10/40/56Gb/s InfiniBand and Ethernet QSFP+ and SFP+ cables and modules.

New Mellanox Cables Simplify Datacenter Installations


Today Mellanox announced new Direct Attach Cables with colored jackets and colored pull tabs, supporting interconnect speeds of 10, 40 and 56Gb/s for both Ethernet and InfiniBand data center networks.

Video: Mellanox Announces HPC-X Toolkit to Interconnect Your Future

Scot Schultz

The Mellanox HPC-X Scalable Toolkit is a comprehensive MPI / OpenSHMEM / PGAS / UPC tool suite for high performance computing environments. “HPC-X enables you to rapidly deploy and deliver maximum application performance without the complexity and costs of licensed third-party tools and libraries,” said Scot Schultz, director of HPC and technical computing at Mellanox. “Users can now solve their most complex problems in reduced time and scale their solutions more efficiently.”

Interview: Mellanox Announces World’s First 100Gb/s EDR InfiniBand Switch


In this video from ISC’14, Gilad Shainer from Mellanox provides a technology update. This week the company announced the world’s first 100Gb/s EDR InfiniBand switch as well as the HPC-X Scalable Software Toolkit.

Mellanox Demonstrates World’s First 100Gb/s EDR InfiniBand Switch


Today Mellanox announced Switch-IB, the next generation of its InfiniBand switch and the first-ever switch IC capable of 100Gb/s per port speeds.

Mellanox & DataON to Provide Cluster-in-a-Box Storage Appliance


This week Mellanox announced that its ConnectX-3 10/40Gb RDMA over Converged Ethernet (RoCE) and FDR 56Gb/s InfiniBand Network Interface Cards (NICs) power DataON Storage Cluster-in-a-Box appliances.

Mellanox Rolls Out Software-Defined FCoE Switch Solution


Today Mellanox announced the world’s fastest and most efficient Fibre Channel over Ethernet (FCoE) switching solution.

NCI Powers Research in the Cloud with Mellanox, RDMA, and OpenStack


Today Mellanox announced that the National Computational Infrastructure (NCI) at the Australian National University has selected the company’s interconnect technologies to support the nation’s researchers.

Mellanox Rolls Out LinkX Cables and Transceivers


Today Mellanox announced its new line of LinkX cables and transceivers supporting interconnect speeds of 10, 40 and 56Gb/s for both Ethernet and InfiniBand data center networks.