
Agenda Posted for HPC User Forum in Tucson, April 11-13

IDC has published the agenda for their next HPC User Forum. The event will take place April 11-13 in Tucson, AZ. “Don’t miss the chance to hear top experts on these high-innovation, high-growth areas of the HPC market. At this meeting, you’ll also hear about government initiatives to get ready for future-generation supercomputers, machine learning, and High Performance Data Analytics.”

Auburn University Launches Hopper Supercomputer from Lenovo

Today Auburn University unveiled its new $1 million supercomputer that will enhance research across campus, from microscopic gene sequencing to huge engineering tasks. The university is also initiating a plan to purchase a new one every few years as research needs evolve and expand.

Saving East African Crops with Supercomputing

“Because the whitefly species are identical to look at, the best way to distinguish them is by examining their genetic differences, so we are deploying a mix of genomics, supercomputing, and evolutionary history. This knowledge will help African farmers and scientists distinguish between the harmless and the invasive ones, develop management strategies, and breed new whitefly-resistant strains of cassava. The computational challenge for our team is in processing the genomic data the sequencing machines produce.”

Chalk Talk: What is a Data Lake?

“If you think of a data mart as a store of bottled water – cleansed and packaged and structured for easy consumption – the data lake is a large body of water in a more natural state. The contents of the data lake stream in from a source to fill the lake, and various users of the lake can come to examine, dive in, or take samples.” These “data lake” systems will hold massive amounts of data and be accessible through file and web interfaces. Data protection for data lakes will consist of replicas and will not require backup, since the data is not updated. Erasure coding will be used to protect large data sets and enable fast recovery. Open source software will be used to reduce licensing costs, and compute systems will be optimized for MapReduce analytics. Automated tiering will be employed for performance and long-term retention requirements. Cold storage – storage that does not require power for long-term retention – will be introduced in the form of tape or optical media.

Registration Opens for Inaugural Nimbix Developer Summit in Dallas

Registration is now open for the inaugural Nimbix Developer Summit. With an impressive lineup of speakers & sponsors from Mellanox, migenius, Xilinx, and more, the event takes place March 15 in Dallas, Texas. “The summit agenda will feature topics such as hardware acceleration, coprocessing, photorealistic rendering, bioinformatics, and high performance analytics. The sessions will conclude with a panel of developers discussing how to overcome challenges of creating and optimizing cloud-based applications.”

Anton 2 Supercomputer to Speed Molecular Simulations at PSC

Today the Pittsburgh Supercomputing Center (PSC) announced a $1.8-million National Institutes of Health grant to make the next-generation Anton 2 supercomputer developed by D. E. Shaw Research (DESRES) available to the biomedical research community. A specialized system for modeling the function and dynamics of biomolecules, the Anton 2 machine at PSC will be the only one of its kind publicly available to U.S. scientists. The grant also extends the operation of the Anton 1 supercomputer currently at PSC until the new Anton 2 is deployed, expected in the Fall of 2016.

CCRT in France Acquires 1.4 Petaflop “Cobalt” Supercomputer from Bull

Today Atos announced that the French CEA and its industrial partners at the Centre for Computing Research and Technology, CCRT, have invested in a new 1.4 petaflop Bull supercomputer. “Three times more powerful than the current computer at CCRT, the new system will be installed in the CEA’s Very Large Computing Centre in Bruyères-le-Châtel, France, mid-2016 to cover expanding industrial needs. Named COBALT, the new Intel Xeon-based supercomputer will be powered by over 32,000 compute cores and storage capacity of 2.5 Petabytes with a throughput of 60 GB/s.”

ALCF Celebrates 10 Years of Leadership Computing

This week, the Argonne Leadership Computing Facility (ALCF) turns one decade old. ALCF is home to Mira, the world’s fifth-fastest supercomputer, along with teams of experts that help researchers from all over the world perform complex simulations and calculations in almost every branch of science. To celebrate its 10th anniversary, Argonne is highlighting 10 accomplishments since the facility opened its doors.

Mira Supercomputer Shaping Fusion Plasma Research

The IBM Blue Gene/Q supercomputer Mira, housed at the Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory, is delivering new insights into the physics behind nuclear fusion, helping researchers develop a new understanding of electron behavior in edge plasma – a critical step toward creating an efficient fusion reaction.

Poznan Launches Eagle Supercomputer with Liquid Cooling from CoolIT Systems

Today CoolIT Systems announced that it has successfully completed the second deployment of its Rack DCLC liquid cooling solution at the Poznan Supercomputing and Networking Center (PSNC) in partnership with Huawei. “We are pleased to have migrated from a liquid cooled pilot project with CoolIT Systems to a full-scale rollout,” said Radoslaw Januszewski, IT Specialist at PSNC. “The pilot project proved to be very reliable, it met our efficiency goals, and provided a bonus performance boost with the processors very happy to be kept at a cool, consistent temperature as a result of liquid cooling’s effectiveness.”