In this video from ISC 2017, Gilad Shainer from Mellanox discusses the company’s newest announcements for HPC:
- New Shield Technology Brings Self Healing to Networks
- InfiniBand continues to grow on the TOP500
- RDMA is enabling Machine Learning at Scale
Today Mellanox announced that InfiniBand solutions accelerate the majority of the new TOP500 systems deployed in 2017, two and a half times more systems than Omni-Path and nearly three times more than other proprietary interconnect products. InfiniBand accelerates 60 percent of the total HPC systems on the list and 48 percent of the Petaflop infrastructures, an increase from the 45 percent reported on the November 2016 TOP500 list. Moreover, InfiniBand solutions connect world-leading Artificial Intelligence and Deep Learning platforms, including the newly listed platform from Facebook. This demonstrates the strong adoption of InfiniBand and its leading market share in high-performance computing and artificial intelligence. Mellanox continues to connect the fastest supercomputer on the list, delivering the highest scalability, performance and efficiency. Mellanox Ethernet solutions also connect several of the 10 Gigabit Ethernet systems, all of the 40 Gigabit Ethernet systems and the first 100 Gigabit Ethernet system.
“InfiniBand’s In-Network Computing offload advantage delivers the highest application performance, scalability and robustness, enabling users to maximize their data center return on investment. In just six months, EDR InfiniBand systems have more than doubled on the TOP500 list, and InfiniBand was chosen by far more end-users compared to a proprietary offering, maintaining its market share. In addition, InfiniBand adoption continues to ramp up in Artificial Intelligence and Deep Learning platforms, becoming the natural choice for these solutions,” said Eyal Waldman, president and CEO of Mellanox Technologies. “We are also happy to see more of our 10, 40 and 100G Ethernet solutions on the TOP500 list, resulting in 192 systems using Mellanox interconnect solutions. Finally, we plan to release our next generation 200Gb/s HDR InfiniBand solutions later this year, further increasing the technology advantage of Mellanox for high-performance computing, cloud, Web2.0, database, deep learning, and compute and storage platforms.”
Published twice a year and publicly available at www.top500.org, the TOP500 list ranks the world’s most powerful computer systems according to the Linpack benchmark rating system. Key takeaways from the list include:
- Mellanox accelerates the fastest supercomputer on the list
- InfiniBand is the most used HPC interconnect in the first half of 2017, connecting 2.5X more new end-user projects than Omni-Path and 3X more than other proprietary products
- Mellanox connects nearly 39 percent of overall TOP500 systems (192 systems, InfiniBand and Ethernet)
- InfiniBand connects 36 percent of the total TOP500 systems (179 systems)
- InfiniBand connects 60 percent of the HPC TOP500 systems
- InfiniBand accelerates 48 percent of the Petascale systems
- EDR InfiniBand installations grew 2.5X in six months
- InfiniBand provides 1.7X higher system efficiency for Petascale systems versus Omni-Path
- Mellanox connects all of the 40G Ethernet systems
- Mellanox connects the first 100G Ethernet system on the list
- InfiniBand is the most used interconnect among the TOP100, TOP200, and TOP300 systems on the TOP500
- InfiniBand is the preferred interconnect for Artificial Intelligence and Deep Learning systems
- Mellanox solutions enable the highest ROI for Machine Learning, High-Performance Computing, Cloud, Storage, Big Data and other applications
Visit Mellanox Technologies at ISC 2017 (booth #E-911) to learn more about the new 200G HDR InfiniBand solutions and to see the full suite of Mellanox’s end-to-end high-performance InfiniBand and Ethernet solutions.