In this video, Mellanox CTO Michael Kagan talks about the next step for SmartNICs and the company’s newly released ConnectX-6 Dx product driven by its own silicon. “The BlueField-2 IPU integrates all the advanced capabilities of ConnectX-6 Dx with an array of powerful Arm processor cores, high performance memory interfaces, and flexible processing capabilities in a single System-on-Chip (SoC), supporting both Ethernet and InfiniBand connectivity up to 200Gb/s.”
Mellanox HDR 200G InfiniBand Speeds Machine Learning with NVIDIA
Today Mellanox announced that its HDR 200G InfiniBand with the “Scalable Hierarchical Aggregation and Reduction Protocol” (SHARP) technology has set new performance records, doubling deep learning operations performance. The combination of Mellanox In-Network Computing SHARP with NVIDIA V100 Tensor Core GPU technology and the NVIDIA Collective Communications Library (NCCL) delivers leading efficiency and scalability to deep learning and artificial intelligence applications.
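The collective that SHARP accelerates here is allreduce: every worker contributes a gradient vector, the vectors are summed, and every worker receives the same result. With SHARP, that reduction happens inside the switch instead of on the hosts. The sketch below is a toy, single-process illustration of the operation itself (not Mellanox or NVIDIA code), just to make the semantics concrete:

```python
# Toy illustration of the allreduce (sum) collective that SHARP
# offloads to the InfiniBand switch. Real deployments use NCCL over
# the fabric; this merely shows the result every rank must end up with.

def allreduce_sum(rank_buffers):
    """Element-wise sum across all ranks, result broadcast to every rank.

    Under SHARP the switch performs the reduction in-network once,
    rather than each host exchanging and summing data itself.
    """
    reduced = [sum(vals) for vals in zip(*rank_buffers)]  # reduction step
    return [list(reduced) for _ in rank_buffers]          # broadcast step

# Example: gradient fragments from 4 workers in a training job.
grads = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]]
result = allreduce_sum(grads)
# every rank now holds [16.0, 20.0]
```

Because the switch sums the data as it flows through, each host sends its buffer once and receives the result once, which is why offloading this collective roughly doubles effective deep learning communication performance at scale.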
HDR InfiniBand Technology Reshapes the World of High-Performance and Machine Learning Platforms
“The recent announcement of HDR InfiniBand included the three network elements required to achieve a full end-to-end implementation of the new technology: ConnectX-6 host channel adapters, Quantum switches and the LinkX family of 200Gb/s cables. The newest generations of InfiniBand bring the game-changing capabilities of In-Network Computing and In-Network Memory to further enhance the new paradigm of Data-Centric data centers – for High-Performance Computing, Machine Learning, Cloud, Web 2.0, Big Data, Financial Services and more – dramatically increasing network scalability and introducing new accelerations for storage platforms and data center security.”