Sandia National Laboratories has already seen the benefits of a major Asetek liquid-cooled HPC system that has been in use for over twelve months. The 600-teraflop Sky Bridge supercomputer, a 1,848-node Cray CS300-LC cluster, was installed with Asetek D2C cooling. With RackCDU D2C, the air heat load was cut by more than 70%, making a mechanical upgrade of the data center's cooling unnecessary and freeing more of the budget for compute.
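As a rough illustration of what a 70% direct-to-chip capture rate means for the residual air-cooling load, the arithmetic can be sketched as follows. The rack power figure here is a hypothetical assumption for illustration only, not a number from the Sandia installation:

```python
# Back-of-envelope heat split for direct-to-chip liquid cooling.
# rack_power_kw is an assumed IT load per rack (hypothetical figure).
rack_power_kw = 30.0
capture_fraction = 0.70  # share of heat carried away by the liquid loop

liquid_heat_kw = rack_power_kw * capture_fraction
air_heat_kw = rack_power_kw - liquid_heat_kw

print(f"Heat to water: {liquid_heat_kw:.1f} kW")   # Heat to water: 21.0 kW
print(f"Residual air load: {air_heat_kw:.1f} kW")  # Residual air load: 9.0 kW
```

The residual air load is what the room's existing CRAC/CRAH capacity must still absorb, which is why a high capture fraction can make a mechanical upgrade unnecessary.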
Asetek will highlight its liquid cooling solutions and successful installations at ISC 2016 next week in Frankfurt. For the first time, Asetek will be displaying its new InRackCDU cooling solution. InRackCDU provides the option of having RackCDU mounted in the server rack. InRackCDU does not take up aisle space and includes the same monitoring features as Asetek’s VerticalRackCDU.
Later in 2016, Intel is expected to release production versions of its 72-core Knights Landing (KNL) coprocessor. These next-generation coprocessors are shaping the physical design of the supercomputers now coming down the pike in a number of ways. One of the most dramatic changes is the significant increase in cooling requirements: these are high-wattage chips that run very hot and present some interesting engineering challenges for systems designers.
Although liquid cooling is considered by many to be the future for data centers, the fact remains that some data centers do not yet need a full transition to liquid cooling, while others are constrained until the next budget cycle. Whatever the reason, newer technologies such as Internal Loop cost less than facility-wide liquid cooling and can replace less efficient air coolers, enabling HPC data centers to keep using the highest-performing CPUs and GPUs.
Asetek showcased its full range of RackCDU hot water liquid cooling systems for HPC data centers at SC15 in Austin. On display were early-adopting OEMs such as CIARA, Cray, Fujitsu, Format and Penguin. HPC installations from around the world incorporating Asetek RackCDU D2C (Direct-to-Chip) technology were also featured. In addition, liquid cooling solutions for both current and future high-wattage CPUs and GPUs from Intel, Nvidia and OpenPower were on display.
Of the varied approaches to liquid cooling, most remain technical curiosities that have failed to achieve real-world adoption to any significant degree. In contrast, both Asetek RackCDU D2C™ (Direct-to-Chip) and Internal Loop Liquid Cooling are seeing accelerating adoption by both OEMs and end users.
Today Asetek announced an OEM purchase agreement with HPC vendor Penguin Computing. As part of the agreement, Penguin will incorporate Asetek’s RackCDU D2C liquid cooling technology into its Tundra Extreme Scale (ES) HPC server product line. RackCDU direct-to-chip hot water liquid cooling enhances Penguin’s ability to provide HPC solutions with extreme energy efficiency and higher rack cluster densities. The agreement has already resulted in an order from Penguin, described in an earlier announcement in which the OEM was not named.
Today Asetek announced its biggest purchase order to date for its RackCDU data center liquid cooling system. The order was placed by an undisclosed Original Equipment Manufacturer (OEM) partner. The order, for 21 RackCDUs with Direct-to-Chip cooling loops, will supply an undisclosed OEM customer installation. Both the OEM and the end user will be announced when the information becomes public.
“The range of cooling options now available is testimony to engineering ingenuity. HPC centers can choose between air, oil, dielectric fluid, or water as the heat-transfer medium. Opting for something other than air means that single or two-phase flow could be available, opening up the possibilities of convective or evaporative cooling and thus saving the cost of pumping the fluid round the system.”