Asetek Lands Nine Installations on the Green500

Today Asetek announced that its RackCDU D2C (Direct-to-Chip) liquid cooling technology is cooling nine installations on the November 2016 Green500 list of the world’s most energy-efficient supercomputers.

“As seen at installations included on both the Green500 and Top500 lists, Asetek’s distributed liquid cooling architecture enables cluster energy efficiency in addition to sustained, un-throttled cluster performance,” said John Hamill, Vice President of WW Sales and Marketing. “Around the world, data centers are increasingly using Asetek technology for High Performance Computing while reducing energy costs.”

The Green500 list ranks the top 500 supercomputers in the world by energy efficiency. The focus on performance-at-any-cost computing has led to the emergence of supercomputers that consume vast amounts of electrical power and produce so much heat that large cooling facilities must be constructed to ensure proper operation. To address this trend, the Green500 list puts a premium on energy-efficient performance for sustainable supercomputing.

Ranked #5 on the list, the University of Regensburg’s QPACE3 is a joint research project (SFB/TRR-55) with the University of Wuppertal and the Jülich Supercomputing Center. Featuring Asetek liquid-cooled Fujitsu PRIMERGY servers, it is one of the first Intel Xeon Phi-based HPC clusters in Europe. Additionally, the recently announced QPACE3 is ranked #375 on the Top500, bringing the number of Asetek-enabled systems on the Top500 to nine.

Ranked #6 on the Green500, Oakforest-PACS is also the highest-performance supercomputer system in Japan and is ranked #5 on the Top500. The installation features high-density Asetek liquid-cooled Fujitsu PRIMERGY KNL nodes installed at the Joint Center for Advanced High-Performance Computing (JCAHPC), operated jointly by the University of Tokyo and the University of Tsukuba.

Several Asetek liquid-cooled Penguin Computing Tundra clusters under the National Nuclear Security Administration’s tri-laboratory Commodity Technology Systems program (CTS-1) are Green500 systems. These clusters are located at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), and Sandia National Laboratories (SNL). They include SNL Cayenne (#47), SNL Serrano (#48), LANL Grizzly (#49), and LLNL Topaz (#66).

Ranked #251 is the Fujitsu installation at the A*STAR Computational Resource Centre (A*CRC) and National Super Computing Centre (NSCC) in Singapore. Located near the equator, this installation demonstrates that Asetek’s data center liquid cooling technology can provide benefits in even the warmest climates.

At #281 and #61 are installations at Sandia National Laboratories (Sky Bridge) and Mississippi State University (Shadow), both utilizing the Cray CS-300LC. These sites have been on the list since 2015 and 2014, respectively.

Asetek liquid cooling is currently available to data centers around the globe through its network of OEM partners.

In this video from SC16, Steve Branton from Asetek describes the company’s innovative liquid cooling systems for high performance computing.
