PSC Goes Heterogeneous with Penguin Super

Today Penguin Computing announced the installation of a new heterogeneous cluster at the University of Pittsburgh. In an intriguing configuration, the system uses a QDR InfiniBand interconnect to link three types of nodes: nodes with Intel Westmere CPUs, nodes with AMD Magny-Cours CPUs, and nodes with both Westmere CPUs and NVIDIA Fermi GPUs.

“When our research efforts called for a new cluster, there was no question that we would go to Penguin for support,” said Kenneth D. Jordan, Distinguished Professor in the School of Arts & Sciences at the University of Pittsburgh. “We have purchased several computer clusters from Penguin and have been very pleased with their reliability. There is a major advantage in terms of maintaining the various clusters to use the same operating system and management and queue software on all systems.”

The Westmere nodes each have 12 cores and 48 GB of memory, and the Magny-Cours nodes each have 48 cores and 128 GB of memory. The choice of hardware was dictated by the mix of calculations carried out by users, and the QDR InfiniBand network enables excellent performance on parallel applications. Like the earlier Penguin Computing clusters installed at the University of Pittsburgh, the new heterogeneous cluster is managed by Penguin’s Scyld ClusterWare software. Read the Full Story.