
ESnet Releases Code for Building Online Interactive Network Portals

This map highlights the sites ESnet serves, the structure of the network and the current traffic load.

ESnet has released open-source code for building online interactive network portals. “Now that the libraries are made available, the team hopes that other organizations will take the code, use it, add to it, and work with ESnet to make the improvements available to the community.”

Video: The Road to Exascale


“Exascale levels of computing pose many system- and application-level computational challenges. Mellanox, as a provider of end-to-end communication services, is advancing the foundation of the InfiniBand architecture to meet the exascale challenges. This presentation will focus on recent technology improvements that significantly improve InfiniBand’s scalability, performance, and ease of use.”

Radio Free HPC Looks at Highlights from Fall 2015 HPC Conferences


In this podcast, the Radio Free HPC team goes over a trip report from Rich Brueckner of insideHPC, who has been on the road at a series of HPC conferences. We captured more than 50 talks in the past month, and we have them all right here with the very latest in High Performance Computing.

Video: NVLink Interconnect for GPUs


“NVLink enables fast data exchange between the CPU and GPU, thereby improving data throughput across the computing system and overcoming a key bottleneck for accelerated computing today. NVLink makes it easier for developers to modify high-performance and data analytics applications to take advantage of accelerated CPU-GPU systems. We think this technology represents another significant contribution to our OpenPOWER ecosystem.”
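As a rough illustration of why CPU-to-GPU bandwidth is the bottleneck NVLink targets, here is a minimal micro-benchmark sketch in C using the standard CUDA runtime API. It times a single pinned host-to-device copy; the 256 MiB buffer size and the lack of warm-up iterations are illustrative simplifications, not details from the announcement.

    /* Hypothetical sketch: time one host-to-device copy, the path that
       NVLink widens on coupled CPU-GPU systems. Compile with nvcc. */
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void) {
        const size_t nbytes = 256UL << 20;     /* 256 MiB, arbitrary size */
        void *host, *dev;
        cudaEvent_t start, stop;
        float ms;

        cudaMallocHost(&host, nbytes);         /* pinned host buffer */
        cudaMalloc(&dev, nbytes);
        cudaEventCreate(&start);
        cudaEventCreate(&stop);

        cudaEventRecord(start, 0);
        cudaMemcpy(dev, host, nbytes, cudaMemcpyHostToDevice);
        cudaEventRecord(stop, 0);
        cudaEventSynchronize(stop);
        cudaEventElapsedTime(&ms, start, stop);

        printf("host-to-device: %.1f GB/s\n", (nbytes / 1e9) / (ms / 1e3));

        cudaFree(dev);
        cudaFreeHost(host);
        return 0;
    }

On a PCIe-attached GPU this typically reports PCIe-class bandwidth; raising exactly this number is the point of NVLink.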

Mellanox to Acquire EZchip, Parent of Tilera


Today Mellanox announced a definitive agreement to acquire EZchip, a leader in high-performance processing solutions for carrier and data center networks. You may recognize EZchip from its recent acquisition of Tilera, maker of the Tile-Gx multicore network processors.

Communication Frameworks for HPC and Big Data

DK Panda, Ohio State University

DK Panda from Ohio State University presented this talk at the HPC Advisory Council Spain Conference. “Dr. Panda and his research group members have been doing extensive research on modern networking technologies including InfiniBand and 10-40GE/iWARP. His research group is currently collaborating with National Laboratories and leading InfiniBand and 10-40GE/iWARP companies on designing various subsystems of next generation high-end systems.”
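As context for what a communication framework does at the application level, here is a minimal MPI ping-pong sketch in C, the classic pattern for measuring point-to-point latency over interconnects like InfiniBand. It uses only standard MPI calls and would run under any MPI implementation, including MVAPICH2 from Panda’s group; the message size and iteration count are arbitrary choices.

    /* Minimal MPI ping-pong between ranks 0 and 1.
       Run with: mpirun -np 2 ./pingpong */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, i, iters = 1000;
        char buf[8];                           /* 8-byte message, arbitrary */
        double t0, t1;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Barrier(MPI_COMM_WORLD);
        t0 = MPI_Wtime();
        for (i = 0; i < iters; i++) {
            if (rank == 0) {
                MPI_Send(buf, 8, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
                MPI_Recv(buf, 8, MPI_CHAR, 1, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
            } else if (rank == 1) {
                MPI_Recv(buf, 8, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
                MPI_Send(buf, 8, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
            }
        }
        t1 = MPI_Wtime();

        if (rank == 0)                         /* round trip = two messages */
            printf("avg one-way latency: %.2f us\n",
                   (t1 - t0) * 1e6 / (2.0 * iters));

        MPI_Finalize();
        return 0;
    }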

PNNL Installs Data Vortex System


Today Data Vortex Technologies announced that the company has sold and delivered a DV205 system, “PEPSY”, to the Pacific Northwest National Laboratory (PNNL). PEPSY is specifically designed to solve problems requiring extensive processor-to-processor communication in parallel computing systems.
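The announcement does not describe Data Vortex’s programming interface, but the fine-grained processor-to-processor traffic the machine targets resembles an all-to-all exchange. The sketch below shows that pattern in standard MPI, purely as a generic illustration; nothing in it is specific to the Data Vortex API.

    /* Generic all-to-all communication pattern: every rank sends one
       int to every other rank. Run with: mpirun -np N ./alltoall */
    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv) {
        int rank, size, i;
        int *sendbuf, *recvbuf;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        sendbuf = malloc(size * sizeof(int));
        recvbuf = malloc(size * sizeof(int));
        for (i = 0; i < size; i++)
            sendbuf[i] = rank * 1000 + i;      /* one word per peer */

        /* Each rank contributes one int to, and receives one from,
           every other rank in a single collective call. */
        MPI_Alltoall(sendbuf, 1, MPI_INT, recvbuf, 1, MPI_INT,
                     MPI_COMM_WORLD);

        if (rank == 0)
            printf("rank 0 received %d ints\n", size);

        free(sendbuf);
        free(recvbuf);
        MPI_Finalize();
        return 0;
    }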

UCX: Co-Design Architecture For Next Generation HPC Systems


“The UCX (Unified Communication X) project is a collaboration between industry, laboratories, and academia to create an open-source, production-grade communication framework for data-centric and high-performance applications. At the core of the UCX project are the combined features, ideas, and concepts of industry-leading technologies including MXM, PAMI and UCCS. Mellanox Technologies has contributed their MXM technology, which provides enhancements to parallel communication.”
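To give a feel for the framework’s programmer-facing side, below is a minimal initialization sketch in C against UCX’s UCP layer, following the library’s standard bootstrap flow: read the configuration, create a context with tag-matching enabled, and create a worker. Error handling is compressed and no actual communication is shown; treat it as a sketch of the public API rather than a complete example.

    /* Minimal UCX (UCP layer) bootstrap. Link with -lucp -lucs. */
    #include <stdio.h>
    #include <ucp/api/ucp.h>

    int main(void) {
        ucp_config_t *config;
        ucp_context_h context;
        ucp_worker_h worker;
        ucp_params_t params = {0};
        ucp_worker_params_t wparams = {0};

        if (ucp_config_read(NULL, NULL, &config) != UCS_OK)
            return 1;

        params.field_mask = UCP_PARAM_FIELD_FEATURES;
        params.features   = UCP_FEATURE_TAG;   /* tag-matching send/recv */
        if (ucp_init(&params, config, &context) != UCS_OK)
            return 1;
        ucp_config_release(config);

        wparams.field_mask  = UCP_WORKER_PARAM_FIELD_THREAD_MODE;
        wparams.thread_mode = UCS_THREAD_MODE_SINGLE;
        if (ucp_worker_create(context, &wparams, &worker) != UCS_OK)
            return 1;

        printf("UCX context and worker initialized\n");

        ucp_worker_destroy(worker);
        ucp_cleanup(context);
        return 0;
    }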

BSC on Integrating Persistent Data and Parallel Programming Models


Toni Cortés from the Barcelona Supercomputing Center presented this talk at the HPC Advisory Council Spain Conference. “BSC is the National Supercomputing Facility in Spain and was officially constituted in April 2005. BSC-CNS manages MareNostrum, one of the most powerful supercomputers in Europe, located at the Torre Girona chapel. The mission of BSC-CNS is to investigate, develop and manage information technology in order to facilitate scientific progress.”

HPC4Health Selects Mellanox InfiniBand for Cancer and Genomics Research


Today Mellanox announced that the HPC4Health Consortium, led by The Hospital for Sick Children (SickKids) and the University Health Network’s Princess Margaret Cancer Centre, has selected its InfiniBand networking solutions to improve patient care and help researchers optimize treatment, with the ultimate goal of finding a cure for cancer. The end-to-end FDR 56Gb/s InfiniBand networking solution was adopted as the foundation of the center’s cancer and genomics program, to accelerate the sharing, processing, and analysis of data generated from radiology imaging, medical imaging analysis, protein folding, and X-ray diffraction, in order to improve patient care and expedite cancer research.