Asetek Liquid Cooling Solution Coming to HPC Center

Today Asetek announced a new order from one of its existing OEM partners for its RackCDU D2C (Direct-to-Chip) liquid cooling solution. The order is part of a new installation for an undisclosed HPC customer.

“I am very pleased with the progress we are making in our emerging data center business segment,” said André Sloth Eriksen, CEO and founder of Asetek. “This repeat order, from one of our OEM partners, to a new end customer confirms the trust in our unique liquid cooling solutions and that adoption is growing.”

The order will result in revenue to Asetek in the range of USD 300,000 for approximately 15 racks with delivery in Q2 2017. The OEM partner as well as the installation site will be announced at a later date.

RackCDU D2C is a hot water liquid cooling solution that captures between 60% and 80% of server heat, reducing data center cooling cost by over 50% and allowing 2.5x-5x increases in data center server density.

D2C removes heat from CPUs, GPUs, memory modules and other high-heat components within servers using water as hot as 40°C (104°F), eliminating the need for expensive and inefficient chilled air to cool these components. Because air chilling is the largest portion of data center cooling OpEx and CapEx, D2C can free up money in constrained data center budgets for investment in more computing rather than letting it disappear into support costs.

With RackCDU D2C, less air needs to be cooled and moved by Computer Room Air Handler (CRAH) or Computer Room Air Conditioning (CRAC) units. Furthermore, liquid-cooled servers need less airflow, making them more energy efficient. RackCDU is capable of returning water from the data center at temperatures high enough to enable waste heat recycling. Data centers choosing this option recover a portion of the energy used to run their servers, further increasing energy cost savings, reducing carbon footprint, and achieving Energy Reuse Effectiveness (ERE) values below 1.0.
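To see why waste heat recycling can push ERE below 1.0, consider The Green Grid's definition, ERE = (total facility energy − reused energy) / IT energy. The sketch below uses hypothetical numbers (they are not from Asetek or this installation) purely to illustrate the arithmetic:

```python
def ere(total_kwh, reused_kwh, it_kwh):
    """Energy Reuse Effectiveness per The Green Grid:
    (total facility energy - reused energy) / IT energy.
    Values below 1.0 mean more energy is recycled than the
    facility's non-IT overhead consumes."""
    return (total_kwh - reused_kwh) / it_kwh

# Hypothetical facility: 1,300 kWh total draw, 1,000 kWh IT load,
# and 400 kWh of captured server heat reused (e.g., building heating).
print(ere(1300, 400, 1000))  # 0.9 -> below 1.0
```

With no heat reuse (reused_kwh = 0), the same formula reduces to PUE, which can never drop below 1.0; reuse is what makes sub-1.0 figures possible.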
