Micron Joins CERN openlab

Last week at SC18, Micron announced that the company has joined CERN openlab, a unique public-private partnership, by signing a three-year agreement. Under the agreement, Micron will provide CERN with advanced next-generation memory solutions to further machine learning capabilities for high-energy physics experiments at the laboratory. Micron's memory solutions, which integrate neural network capabilities, will be tested in the data-acquisition systems of experiments at CERN.

Video: How AI is Helping Scientists with the Large Hadron Collider

In this video from SC18 in Dallas, Dr. Sofia Vallecorsa from CERN openlab describes how AI is being used in the design of experiments for the Large Hadron Collider. "An award-winning effort at CERN has demonstrated potential to significantly change how the physics-based modeling and simulation communities view machine learning. The CERN team demonstrated that AI-based models have the potential to act as orders-of-magnitude-faster replacements for computationally expensive tasks in simulation, while maintaining a remarkable level of accuracy."

Argonne is Supercomputing Big Data from the Large Hadron Collider

Over at Argonne, Madeleine O'Keefe writes that the Lab is supporting CERN researchers working to interpret Big Data from the Large Hadron Collider (LHC), the world's largest particle accelerator. The LHC is expected to output 50 petabytes of data this year alone, the equivalent of nearly 15 million high-definition movies, an amount so enormous that analyzing it all poses a serious challenge to researchers.
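
As a back-of-the-envelope check on that comparison, here is a minimal sketch of the arithmetic; the roughly 3.5 GB per HD movie used below is an assumption for illustration, not a figure from the article.

```python
# Rough arithmetic behind the "nearly 15 million HD movies" comparison.
# The ~3.5 GB-per-movie figure is an assumption, not from the article.
PETABYTE = 10**15  # bytes
GIGABYTE = 10**9   # bytes

lhc_output_bytes = 50 * PETABYTE    # projected LHC output this year
movie_size_bytes = 3.5 * GIGABYTE   # assumed size of one HD movie

movies = lhc_output_bytes / movie_size_bytes
print(f"~{movies / 1e6:.1f} million movies")  # prints ~14.3 million
```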

Generative Models for Application-Specific Fast Simulation of LHC Collision Events

Maurizio Pierini from CERN gave this talk at PASC18. "We investigate the possibility of using generative models (e.g., GANs and variational autoencoders) as analysis-specific data augmentation tools to increase the size of the simulation data used by the LHC experiments. With the LHC entering its high-luminosity phase in 2025, the projected computing resources will not be able to sustain the demand for simulated events. Generative models are already being investigated as a means of speeding up the centralized simulation process."
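
The abstract does not include code, but a minimal sketch of the underlying idea might look like the toy GAN below, whose generator maps random noise to calorimeter-like energy-deposit grids. The synthetic "simulated events," the network architectures, and all hyperparameters are illustrative assumptions, not the models used by the LHC experiments.

```python
import torch
import torch.nn as nn

# Toy stand-in for expensive full simulation: 8x8 "calorimeter" grids of
# Gaussian energy deposits. Real events would come from Geant4-style tools.
def simulated_events(n):
    centres = torch.rand(n, 2) * 6 + 1
    grid = torch.arange(8).float()
    gx, gy = torch.meshgrid(grid, grid, indexing="ij")
    d2 = (gx - centres[:, 0, None, None]) ** 2 + (gy - centres[:, 1, None, None]) ** 2
    return torch.exp(-d2 / 2.0).reshape(n, 64)

latent_dim = 16
generator = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.Sigmoid(),      # energies scaled to [0, 1]
)
discriminator = nn.Sequential(
    nn.Linear(64, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1),                     # real/fake logit
)

bce = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(2000):
    real = simulated_events(64)
    fake = generator(torch.randn(64, latent_dim))

    # Discriminator: label real events 1, generated events 0.
    d_loss = bce(discriminator(real), torch.ones(64, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(64, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator: try to fool the discriminator.
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Sampling the trained generator is far cheaper than full simulation,
# which is the speed-up such data-augmentation approaches target.
augmented = generator(torch.randn(10_000, latent_dim)).detach()
```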

Training Generative Adversarial Models over Distributed Computing Systems

Gul Rukh Khattak from CERN gave this talk at PASC18. "We use a dataset composed of the energy depositions from electrons, photons, and charged and neutral hadrons in a fine-grained digital calorimeter. Training these models is quite compute-intensive, even with the help of GPGPUs, and we propose a method to train them across multiple nodes and GPGPUs using a standard message passing interface. We report on the scaling of time-to-solution."
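
The talk itself is not reproduced here, but the pattern the abstract describes, each worker computing gradients on its own data shard and averaging them over MPI after every step, might look roughly like this sketch using mpi4py; the model, data, and hyperparameters are placeholders rather than the calorimeter networks from the talk.

```python
# Data-parallel training over MPI: each rank trains on its own shard and
# gradients are averaged across ranks after every step. Launch with e.g.
#   mpirun -np 4 python train_mpi.py
# Model and data are placeholders, not the networks from the talk.
from mpi4py import MPI
import torch
import torch.nn as nn

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

torch.manual_seed(0)                 # identical initial weights on every rank
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 64))
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

torch.manual_seed(rank)              # a different data shard per rank
x, y = torch.randn(256, 64), torch.randn(256, 64)

for epoch in range(10):
    opt.zero_grad()
    loss_fn(model(x), y).backward()

    # Sum gradients across ranks in place, then divide to get the average.
    for p in model.parameters():
        buf = p.grad.numpy()         # shares memory with the gradient tensor
        comm.Allreduce(MPI.IN_PLACE, buf, op=MPI.SUM)
        p.grad.div_(size)

    opt.step()                       # every rank applies the same update

if rank == 0:
    print("final loss:", loss_fn(model(x), y).item())
```

Because all ranks start from the same weights and apply the same averaged gradients, the model replicas stay in sync without any explicit broadcast after initialization.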

Addressing Computing Challenges at CERN openlab

In this special guest feature from Scientific Computing World, Robert Roe speaks with Dr Maria Girone, Chief Technology Officer at CERN openlab, ahead of her keynote presentation at ISC High Performance. "The challenge of building the largest particle accelerator is complete, but there is another: harnessing all of the data produced through experimentation. This will become even greater when the 'high-luminosity' LHC experiments begin in 2026."

Agenda Posted for Mass Storage Conference in Santa Clara

The 34th International Conference on Massive Storage Systems and Technologies (MSST 2018) has posted its speaker agenda. The event takes place May 14-16 in Santa Clara, California. "Join the discussion on webscale IT and the demands placed on storage systems by IoT, healthcare, scientific research, and the continuing stream of smart applications (apps) for mobile devices."

Video: Computing Challenges at the Large Hadron Collider

CERN's Maria Girone gave this talk at the HiPEAC 2018 conference in Manchester. "The Large Hadron Collider (LHC) is one of the largest and most complicated scientific apparatuses ever constructed. In this keynote, I will discuss the challenges of capturing, storing and processing the large volumes of data generated at CERN. I will also discuss how these challenges will evolve towards the High-Luminosity Large Hadron Collider (HL-LHC), the upgrade programme scheduled to begin taking data in 2026 and to run into the 2030s, generating some 30 times more data than the LHC has produced to date."

Video: HiPEAC Conference Looks at Innovations in Computer Architecture

This video features highlights from the first day of the HiPEAC conference, which took place last week in Manchester. HiPEAC is the premier European forum for experts in computer architecture, programming models, compilers and operating systems for embedded and general-purpose systems. "HiPEAC Coordinator Koen De Bosschere tells us what's been going on, including a great keynote talk from Maria Girone of CERN. Meanwhile, Arm's John Goodacre tells us why Manchester is a great location for the conference and why it's important for companies like Arm to be involved in HiPEAC."

Firing up a Continent with HPC

In this special guest feature from Scientific Computing World, Nox Moyake describes the process of entrenching and developing HPC in South Africa. "The CHPC currently has about 1,000 users; most are in academia and others in industry. The centre supports research across a number of domains and participates in a number of major international projects, such as CERN and the SKA."