

GPCNeT or GPCNoT?

In this special guest feature, Gilad Shainer from Mellanox Technologies writes that the new GPCNeT benchmark is actually a measure of relative performance under load rather than a measure of absolute performance. “When it comes to evaluating high-performance computing systems or interconnects, there are much better benchmarks available. Moreover, benchmarking real workloads is obviously a better approach for determining system or interconnect performance and capabilities. The drawbacks of the GPCNeT benchmark can far outweigh its benefits.”
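The distinction Shainer draws can be shown with a toy calculation (illustrative numbers only, not output of the actual benchmark): a metric that reports relative slowdown under load can rank a system with worse absolute latency ahead of a faster one.

```python
# Illustrative sketch of a load-relative metric, in the spirit of a
# "congestion impact" ratio: how much a communication pattern slows
# down when synthetic congestor traffic shares the network, versus
# running in isolation. Numbers below are hypothetical.

def congestion_impact(isolated_latency_us: float, congested_latency_us: float) -> float:
    """Relative slowdown under load; 1.0 means congestion has no effect."""
    return congested_latency_us / isolated_latency_us

# System A: very low absolute latency, but degrades 4x under load.
system_a = congestion_impact(isolated_latency_us=1.0, congested_latency_us=4.0)

# System B: 10x worse absolute latency, but only degrades 1.5x under load.
system_b = congestion_impact(isolated_latency_us=10.0, congested_latency_us=15.0)

print(system_a)  # 4.0
print(system_b)  # 1.5  -- the slower system "wins" on the relative metric
```

The relative metric favors System B even though its congested latency (15 µs) is still nearly 4x worse than System A's (4 µs) in absolute terms, which is the crux of the argument above.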

OpenStack Adds Native Upstream Support for HDR InfiniBand

Today Mellanox announced that OpenStack software now includes native, upstream support for virtualization over HDR 200 gigabit InfiniBand networks. This enables customers to build high-performance OpenStack-based cloud services on an advanced interconnect infrastructure, taking advantage of InfiniBand’s extremely low latency, high data throughput, In-Network Computing, and more.

HDR 200Gb/s InfiniBand for HPC & AI

In this video from the DDN booth at SC19, Scot Schultz from Mellanox presents: Connecting Visions: HDR 200Gb/s InfiniBand. “HDR 200Gb/s InfiniBand accelerates 31 percent of new InfiniBand-based systems on the current TOP500, including the fastest TOP500 supercomputer built this year. The results also highlight InfiniBand’s continued position in the top three supercomputers in the world and its acceleration of six of the top 10 systems. Since the TOP500 list release in June 2019, InfiniBand’s presence has increased by 12 percent, and it now accelerates 141 supercomputers on the list.”

Mellanox HDR 200G InfiniBand is powering next-gen supercomputers

Today Mellanox announced that HDR 200G InfiniBand is powering the next generation of supercomputers worldwide, enabling higher levels of research and scientific discovery. HDR 200G InfiniBand solutions include ConnectX-6 adapters, Mellanox Quantum switches, LinkX cables and transceivers, and software packages. With its high data throughput, extremely low latency, and smart In-Network Computing acceleration engines, HDR InfiniBand provides world-leading performance and scalability for the most demanding compute and data applications.

Video: How Intel Data-Centric Technologies will power the Frontera Supercomputer at TACC

In this video, researchers from the Texas Advanced Computing Center describe how Intel data-centric technologies power the Frontera supercomputer, which is currently under installation. “This system will provide researchers the groundbreaking computing capabilities needed to grapple with some of science’s largest challenges. Frontera will provide greater processing and memory capacity than TACC has ever had, accelerating existing research and enabling new projects that would not have been possible with previous systems.”

Video: Mellanox Takes HPC Interconnects to the Next Level at SC17

In this video, Gilad Shainer from Mellanox describes the company’s advanced interconnect technologies that were on display at SC17 in Denver. “Mellanox is leading industry innovation by providing the highest throughput and lowest latency HDR 200Gb/s InfiniBand and Ethernet solutions available today, with a clear roadmap for tomorrow.”

HDR InfiniBand Technology Reshapes the World of High-Performance and Machine Learning Platforms

“The recent announcement of HDR InfiniBand included the three network elements required for a full end-to-end implementation of the new technology: ConnectX-6 host channel adapters, Quantum switches, and the LinkX family of 200Gb/s cables. The newest generations of InfiniBand bring the game-changing capabilities of In-Network Computing and In-Network Memory to further advance the new paradigm of data-centric data centers – for High-Performance Computing, Machine Learning, Cloud, Web2.0, Big Data, Financial Services and more – dramatically increasing network scalability and introducing new accelerations for storage platforms and data center security.”

What’s Next for HPC? A Q&A with Michael Kagan, CTO of Mellanox

As an HPC technology vendor, Mellanox is in the business of providing the leading-edge interconnects that drive many of the world’s fastest supercomputers. To learn more about what’s new for SC16, we caught up with Michael Kagan, CTO of Mellanox. “Moving InfiniBand beyond EDR to HDR is critical not only for HPC, but also for the numerous industries that are adopting AI and Big Data to make real business sense out of the ever-growing amount of data we continue to collect on a daily basis.”

Mellanox Brings HDR to SC16 while Dominating Today’s TOP500

“InfiniBand’s advantages of high performance, scalability, and robustness enable users to maximize their data center return on investment. InfiniBand was chosen by far more end users than any proprietary offering, resulting in a more than 85 percent market share. We are happy to see our open Ethernet adapter and switch solutions enable all of the 40G and the first 100G Ethernet systems on the TOP500 list, for a total of 194 systems using Mellanox for their compute and storage connectivity.”

Slidecast: Mellanox Announces 200Gb/s HDR InfiniBand Solutions

In this slidecast, Gilad Shainer from Mellanox announces the world’s first HDR 200Gb/s data center interconnect solutions. “These 200Gb/s HDR InfiniBand solutions maintain Mellanox’s generation-ahead leadership while enabling customers and users to leverage an open, standards-based technology that maximizes application performance and scalability and minimizes overall data center total cost of ownership. Mellanox 200Gb/s HDR solutions will become generally available in 2017.”