RWTH Aachen University Deploys NEC LX Supercomputer


Today NEC Corporation announced the deployment of an LX supercomputer at RWTH Aachen University in Germany. The system will provide high performance computing services for engineering and scientific research.

“We selected the NEC LX technology because of its superior performance, as well as its low total cost of ownership due to innovative cooling technology,” said Professor Matthias Müller, head of the IT Center at RWTH Aachen University. “Going forward, we are excited to work with NEC as a strong corporate partner in expanding our IT research.”

RWTH is one of Europe’s leading universities, with a distinctive approach to connecting engineering and the natural sciences. The University optimizes and accelerates the development of new products through an integrated approach supported by mathematical modeling and simulation in fields such as fluid dynamics and materials science. The efficient use of high performance computing enables the development of simulation techniques for the flow of materials across entire industrial process chains.

Built around the new Intel Xeon E5-2600 processors, the NEC system comprises large-memory SMP nodes and dual-socket MPI compute nodes, with a total of more than 19,000 computational cores capable of delivering over 600 Teraflops. The nodes are connected through a high-speed Intel Omni-Path network whose topology allows continuous expansion of the system to match the steadily growing HPC demands of RWTH researchers.
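For readers who want a rough sanity check of that peak figure, here is a minimal back-of-the-envelope sketch in Python. The FLOPs-per-cycle and clock values are our assumptions (Broadwell-generation cores with two 256-bit FMA units at roughly 2.2 GHz); the announcement does not state the exact SKU or clock.

    # Rough peak-performance estimate; FLOPs/cycle and clock speed
    # are assumptions, not figures from the announcement.
    cores = 19_000        # quoted core count
    flops_per_cycle = 16  # assumed: 2 x 256-bit FMA units, double precision
    clock_ghz = 2.2       # assumed base clock

    peak_tflops = cores * flops_per_cycle * clock_ghz / 1_000
    print(f"Estimated peak: {peak_tflops:.0f} TFLOPS")  # ~669 TFLOPS

Under those assumptions the estimate lands comfortably above the quoted “more than 600 Teraflops,” which is plausible once AVX clock reductions under heavy vector load are taken into account.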

The system features a 4 PetaByte NEC LXFS-z parallel file system capable of 60 GigaByte/s of bandwidth: a scalable ZFS-based Lustre solution with advanced data integrity features.
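To put those storage figures in perspective, a quick back-of-the-envelope calculation (ours, not NEC’s, and assuming decimal units) shows how long a full sweep of the file system would take at the quoted aggregate bandwidth:

    # Time to stream the entire 4 PB file system at the quoted
    # 60 GB/s aggregate bandwidth (decimal units assumed).
    capacity_gb = 4 * 1_000_000  # 4 PB expressed in GB
    bandwidth_gbs = 60

    hours = capacity_gb / bandwidth_gbs / 3600
    print(f"Full sweep: {hours:.1f} hours")  # ~18.5 hours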

According to NEC, the system is highly energy efficient. The servers are cooled entirely with facility water, which reduces the air cooling requirements for the computer room by 90 percent. The cooling system tolerates water inlet temperatures of up to 30°C, so for most of the year the cluster can be cooled free of charge through a heat exchanger that requires no external cooling power.
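As a rough illustration of why warm-water cooling scales, the heat a water loop carries away follows Q = ṁ · c_p · ΔT. The sketch below uses purely illustrative numbers (the flow rate and temperature rise are our assumptions, not system specifications):

    # Heat removed by a water loop: Q = m_dot * c_p * dT.
    # Flow rate and temperature rise are illustrative assumptions.
    flow_kg_s = 10.0   # assumed flow rate, kg/s
    c_p = 4186         # specific heat of water, J/(kg*K)
    delta_t = 10.0     # assumed inlet-to-outlet rise, K (e.g. 30 -> 40 C)

    q_kw = flow_kg_s * c_p * delta_t / 1_000
    print(f"Heat removed: {q_kw:.0f} kW")  # ~419 kW for this example loop

Even a modest flow rate removes hundreds of kilowatts, which is why liquid cooling can displace most of a machine room’s air handling.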
