Iceotope Showcases ExaNeSt Cooling Technology for Exascale


In this video from SC15, Peter Hopton from Iceotope describes the company's innovative liquid cooling technology for the European ExaNeSt project.

“ExaNeSt will develop, evaluate, and prototype the physical platform and architectural solution for a unified Communication and Storage Interconnect and the physical rack and environmental structures required to deliver European Exascale Systems. The consortium brings technology, skills, and knowledge across the entire value chain from computing IP to packaging and system deployment; and from operating systems, storage, and communication to HPC with big data management, algorithms, applications, and frameworks. Building on a decade of advanced R&D, ExaNeSt will deliver the solution that can support exascale deployment in the follow-up industrial commercialization phases.”

Full Transcript:

insideHPC: Peter, we came up here to the SC15 Emerging Technologies Area to see what’s going on with the cutting edge stuff and I was reading about ExaNeSt. What can you tell us about ExaNeSt? What is this thing?

Peter Hopton: ExaNeSt is part of the European race to exascale computing. There are effectively two competing races; one of them is formed of five projects, and ExaNeSt is one of those projects. It's the project that covers the interconnect, the cooling, the power, and the mechanicals. It's accompanied by a variety of other projects, each doing other items. For example, ExaNoDe is doing the nodes, which are ARM-based, with a coherent cache and the UNIMEM architecture. So as far as ExaNeSt is concerned, we get these different devices at very high density that want to be packed very closely together, so we cool them. We've been selected because we can cool things three-dimensionally: we can cool package on package, mezzanine on mezzanine, and that allows that density to be achieved, which helps drive up the energy efficiency of the components.

insideHPC: So this uses liquid cooling, right? Is it water or some other coolant?

Peter Hopton: The electronics are immersed in a fluorinated plastic that is liquid at room temperature, and it's inert. This coolant moves around very quickly as heat is supplied to it; it's very naturally convective. It's contained within our blades, and when you apply heat from the electronics, it rises and then dumps that heat to the shell of the blade, where it goes into the cabinet-level cooling backplane, which uses water.

That enables us to have hot-water input. With the existing technology that we have, which is commercially available, we've demonstrated 53 degrees Celsius inlet water without any loss of performance on Xeon 2690 processors. So people can have performance exceeding the published Intel LINPACK numbers all the way up to 53 degrees Celsius inlet, which means operators don't have to invest in chiller equipment to run the HPC apparatus.
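To see why a 53 degrees Celsius inlet limit removes the need for chillers, here is a minimal back-of-envelope sketch. The 53 C figure comes from the interview; the peak ambient temperature and the dry-cooler approach temperature are illustrative assumptions, not Iceotope or ExaNeSt numbers.

```python
# Back-of-envelope check: can a simple dry cooler (no chiller) keep the
# water supply below the 53 C inlet limit quoted in the interview?
MAX_INLET_C = 53.0  # demonstrated inlet limit from the interview

def dry_cooler_supply_temp(ambient_c: float, approach_k: float) -> float:
    """Approximate supply temperature from a dry cooler: ambient + approach."""
    return ambient_c + approach_k

# Assumed values: a hot-climate peak ambient and a typical dry-cooler approach.
peak_ambient_c = 40.0
approach_k = 8.0

supply_c = dry_cooler_supply_temp(peak_ambient_c, approach_k)
status = "chiller-free cooling works" if supply_c <= MAX_INLET_C else "a chiller is needed"
print(f"Dry-cooler water supply: {supply_c:.1f} C -> {status}")
```

Even with those deliberately pessimistic assumptions, the supply water stays comfortably below the 53 C threshold, which is the point Hopton is making about avoiding chiller capital cost.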

insideHPC: For ExaNeSt, you are using ARM processors. I always thought of ARM as low power; it's in our phones, and our phones don't get hot. What kind of density are we talking about? Hundreds of cores, or what?

Peter Hopton: Thousands and thousands of cores. Yes, it's about packing these nice low-power cores tightly enough and close enough together, sharing a large amount of memory. The memory is being stacked too, so you end up with a single blade about the size of a traditional server blade, probably emitting around two and a half kilowatts of heat.
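A rough sketch of what those figures imply follows. The 2.5 kW per blade is from the interview; the core count per blade and the water temperature rise across the blade are assumptions made only to illustrate the arithmetic.

```python
# Rough per-blade numbers implied by the interview, with assumed inputs marked.
blade_heat_w = 2500.0     # ~2.5 kW per blade, as quoted above
assumed_cores = 1000      # assumption: order-of-magnitude "thousands of cores" per blade
cp_water = 4186.0         # J/(kg*K), specific heat of water
assumed_delta_t_k = 10.0  # assumption: water temperature rise across the blade

watts_per_core = blade_heat_w / assumed_cores
# Heat balance Q = m_dot * cp * dT gives the water flow needed per blade.
flow_kg_s = blade_heat_w / (cp_water * assumed_delta_t_k)
flow_l_min = flow_kg_s * 60.0  # ~1 kg of water per litre

print(f"~{watts_per_core:.1f} W per core; "
      f"~{flow_l_min:.1f} L/min of water carries {blade_heat_w / 1000:.1f} kW "
      f"at a {assumed_delta_t_k:.0f} K rise")
```

Under those assumptions a blade needs only a few litres of water per minute through the backplane, which is why the passive convection inside the blade plus a modest water loop can handle server-class heat densities.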

insideHPC: So what’s the next step? Is there a prototype coming next?

Peter Hopton: We have two stages. The first is to make a prototype, which will also become a product, based on ARM processors at up to about 800 watts per blade. Then we go double-sided, double-density in terms of the cooling, in order to move to a system that's dense enough to achieve exascale. That will produce a prototype in 2018 that will be exascale-capable if we achieve our ambitions within the project.
