Mellanox Accelerates NVMe/TCP and RoCE Fabrics to 200Gb/s

Today Mellanox announced acceleration of NVMe/TCP at speeds up to 200Gb/s. The entire portfolio of shipping ConnectX adapters supports NVMe-oF over both TCP and RoCE, and the newly introduced ConnectX-6 Dx and BlueField-2 products also secure NVMe-oF connections over IPsec and TLS using hardware-accelerated encryption and decryption. These Mellanox solutions empower cloud, telco and enterprise data […]
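
For readers who want to try NVMe/TCP on a Linux host, the sketch below wraps the standard nvme-cli discover/connect flow in Python. The target address, port, and NQN are hypothetical placeholders; nvme-cli and the nvme-tcp kernel module are assumed to be available.

```python
import subprocess

# Hypothetical NVMe/TCP target details -- replace with real values.
TARGET_ADDR = "192.168.1.10"   # IP address of the NVMe-oF target
TARGET_PORT = "4420"           # default NVMe/TCP service port
TARGET_NQN = "nqn.2019-09.io.example:nvme-tcp-target"  # placeholder NQN

def run(cmd):
    """Run a command and return its stdout, raising on failure."""
    result = subprocess.run(cmd, check=True, capture_output=True, text=True)
    return result.stdout

# Load the NVMe/TCP transport, discover the target's subsystems,
# connect to one, and list the resulting namespaces.
run(["modprobe", "nvme-tcp"])
print(run(["nvme", "discover", "-t", "tcp", "-a", TARGET_ADDR, "-s", TARGET_PORT]))
run(["nvme", "connect", "-t", "tcp", "-a", TARGET_ADDR, "-s", TARGET_PORT,
     "-n", TARGET_NQN])
print(run(["nvme", "list"]))
```

Once connected, the remote namespace appears as an ordinary local block device (e.g. /dev/nvme1n1), which is what lets NVMe-oF slot in under existing filesystems unchanged.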

Report: Mellanox ConnectX Ethernet NICs Outperforming Competition

Today Mellanox announced that laboratory tests by The Tolly Group prove its ConnectX 25GE Ethernet adapter significantly outperforms the Broadcom NetXtreme E series adapter in terms of performance, scalability and efficiency. “Our testing shows that with RoCE, storage traffic, and DPDK, the Mellanox NIC outperformed the Broadcom NIC in throughput and efficient CPU utilization. ConnectX-5 also used ‘Zero-Touch RoCE’ to deliver high throughput even with partial and no congestion control, two scenarios where Broadcom declined to be tested.”

Mellanox Announces Support for SONiC Open Source Network Operating System

Today Mellanox announced ASIC-to-Protocol (A2P) customer support solutions for the SONiC Network Operating System (NOS) on Mellanox Spectrum switches. “Every week we hear from more customers who want to combine the power of SONiC with the best-in-class switch silicon in Mellanox Spectrum. Our unique support offering and vast SONiC experience make this easy for new and existing SONiC customers.”
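
As a rough sketch of what configuring SONiC on a Spectrum switch looks like, the snippet below edits a PORT entry in the config_db.json that SONiC loads at startup. The file path and table layout follow SONiC's config_db conventions, but the port name and values here are placeholder assumptions.

```python
import json

# Default location of SONiC's startup configuration (assumed).
CONFIG_DB = "/etc/sonic/config_db.json"

with open(CONFIG_DB) as f:
    cfg = json.load(f)

# The PORT table maps front-panel ports to their settings.
# Example: bring Ethernet0 up at 100G with a jumbo MTU (placeholder values).
port = cfg.setdefault("PORT", {}).setdefault("Ethernet0", {})
port.update({
    "admin_status": "up",
    "speed": "100000",   # speed in Mb/s, i.e. 100GbE
    "mtu": "9100",
})

with open(CONFIG_DB, "w") as f:
    json.dump(cfg, f, indent=4)

# Apply on the switch with the SONiC CLI: config reload -y
```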

Mellanox to Ship 1 Million ConnectX Adapters in Q3 2019

Today Mellanox announced it is on track to ship over one million ConnectX and BlueField Ethernet network adapters in Q3 2019, a new quarterly record. “Leading data centers worldwide select the award-winning ConnectX and BlueField SmartNICs to leverage networking speeds of 25, 50, 100, and 200 Gb/s, and take advantage of advanced offload capabilities to accelerate networking, virtualization, storage and security tasks alike — freeing up server CPUs for money-making applications.”

Guardicore and Mellanox to Deliver Agentless and High-Performance Micro-Segmentation in Data Centers

Today Guardicore announced that it has partnered with Mellanox to deliver the first agentless, high-performance, low-latency micro-segmentation solution for high-speed 10G-100G networks. The solution combines the Guardicore Centra security platform with Mellanox BlueField SmartNICs to provide customers with hardware-embedded micro-segmentation security. This integration allows customers using BlueField SmartNICs to meet micro-segmentation requirements on high-speed networks, or in environments where agent-based solutions cannot be used. The new solution is fully integrated and managed centrally by Guardicore Centra.
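
To make the micro-segmentation idea concrete, here is a minimal, purely illustrative default-deny allow-list check of the kind such a policy engine enforces per flow; it is not Guardicore's or Mellanox's API, and all names and rules are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    src: str    # source workload label
    dst: str    # destination workload label
    port: int   # destination TCP port

# Micro-segmentation policy: only explicitly allowed flows pass;
# everything else is dropped by default (hypothetical rules).
ALLOWED = {
    Flow("web-tier", "app-tier", 8443),
    Flow("app-tier", "db-tier", 5432),
}

def permit(flow: Flow) -> bool:
    """Default-deny: a flow is forwarded only if a rule allows it."""
    return flow in ALLOWED

print(permit(Flow("web-tier", "app-tier", 8443)))  # True
print(permit(Flow("web-tier", "db-tier", 5432)))   # False: no direct web-to-db path
```

Running this kind of check on the SmartNIC rather than in a host agent is what makes the approach "agentless": the enforcement point sits on the adapter, outside the workload's operating system.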

Mellanox Rolls Out New LinkX 200G & 400G Cables & Transceivers

Today Mellanox announced new LinkX 100/200/400G cables and transceivers at the China International Optoelectronic Expo (CIOE), September 4th in Shenzhen, China, and the European Conference on Optical Communication (ECOC), September 21st in Dublin, Ireland. “We’ve had tremendous adoption of our full line of LinkX 25/50/100G cables and transceivers with web-scale, cloud computing, and OEM customers in China and worldwide,” said Steen Gundersen, vice president of LinkX interconnects, Mellanox Technologies. “We are just at the beginning of the transition to 200G, and 400G will soon follow. Customers select Mellanox because of our expertise in high-speed interconnects, our capacity to ship in volume, and the high quality of our products.”

Video: Mellanox Rolls Out SmartNICs

In this video, Mellanox CTO Michael Kagan talks about the next step for SmartNICs and the company’s newly released ConnectX-6 Dx product driven by its own silicon. “The BlueField-2 IPU integrates all the advanced capabilities of ConnectX-6 Dx with an array of powerful Arm processor cores, high performance memory interfaces, and flexible processing capabilities in a single System-on-Chip (SoC), supporting both Ethernet and InfiniBand connectivity up to 200Gb/s.”

Mellanox Powers Virtualized Machine Learning with VMware and NVIDIA

Today Mellanox announced that its RDMA (Remote Direct Memory Access) networking solutions for VMware vSphere enable virtualized Machine Learning solutions that achieve higher GPU utilization and efficiency. “As Moore’s Law has slowed, traditional CPU and networking technologies are no longer sufficient to support the emerging machine learning workloads,” said Kevin Deierling, vice president of marketing, Mellanox Technologies. “Using hardware compute accelerators such as NVIDIA T4 GPUs and Mellanox’s RDMA networking solutions has proven to boost application performance in virtualized deployments.”
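
As background, RDMA-capable NICs show up on the host as verbs devices, and a quick way to sanity-check that a guest actually sees them is via pyverbs, the Python bindings that ship with rdma-core. The device name "mlx5_0" below is an assumption, typical of ConnectX-4 and later Mellanox adapters.

```python
# Enumerate RDMA (verbs) devices using pyverbs from rdma-core.
# Requires an RDMA-capable NIC with its drivers loaded.
import pyverbs.device as d

for dev in d.get_device_list():
    print(dev.name.decode())

# Open a device and query its capabilities
# ("mlx5_0" is an assumed name for a Mellanox adapter).
ctx = d.Context(name="mlx5_0")
print(ctx.query_device())
```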

Mellanox to Maximize Performance on AMD EPYC 7002 Processors

Today Mellanox announced that the company’s InfiniBand ConnectX smart adapter solutions are optimized to provide breakthrough performance and scalability for the new AMD EPYC 7002 Series processor-based compute and storage infrastructures. “The combination of Mellanox 25, 50, 100 and 200 Gigabit Ethernet and HDR 200 Gigabit InfiniBand adapters, and PCI Express 4.0 support in the second-generation AMD EPYC processor, provides high-performance computing, artificial intelligence, cloud and enterprise data centers with the high data bandwidth they need for the most compute- and storage-demanding applications.”
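
The PCIe 4.0 point is worth unpacking with quick arithmetic: a 200Gb/s port needs roughly 25 GB/s of sustained host bandwidth, which a PCIe 3.0 x16 slot cannot deliver but a PCIe 4.0 x16 slot can. The figures below use the published per-lane transfer rates and 128b/130b encoding overhead, before any packet-protocol overhead.

```python
# Usable PCIe bandwidth vs. what a 200Gb/s NIC port requires.
# PCIe 3.0 runs at 8 GT/s per lane, PCIe 4.0 at 16 GT/s,
# both with 128b/130b encoding (128 payload bits per 130 on the wire).
ENCODING = 128 / 130
LANES = 16

for gen, gts in [("PCIe 3.0", 8), ("PCIe 4.0", 16)]:
    gbps_per_lane = gts * ENCODING          # effective Gb/s per lane
    total_gbytes = gbps_per_lane * LANES / 8  # GB/s across x16
    print(f"{gen} x16: ~{total_gbytes:.1f} GB/s usable")

print(f"200GbE line rate: {200 / 8:.1f} GB/s required")
# PCIe 3.0 x16 (~15.8 GB/s) falls short of 25 GB/s;
# PCIe 4.0 x16 (~31.5 GB/s) has headroom.
```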

Video: Mellanox HDR InfiniBand makes inroads on the TOP500

In this video from ISC 2019, Gilad Shainer from Mellanox describes how HDR InfiniBand technology is proliferating across the TOP500 list of the world’s most powerful supercomputers. “HDR 200G InfiniBand made its debut on the list, accelerating four supercomputers worldwide, including the fifth top-ranked supercomputer in the world located at the Texas Advanced Computing Center, which also represents the fastest supercomputer built in 2019.”