Generative Models for Application-Specific Fast Simulation of LHC Collision Events

Maurizio Pierini from CERN gave this talk at PASC18. “We investigate the possibility of using generative models (e.g., GANs and variational autoencoders) as analysis-specific data augmentation tools to increase the size of the simulation data used by the LHC experiments. With the LHC entering its high-luminosity phase in 2025, the projected computing resources will not be able to sustain the demand for simulated events. Generative models are already being investigated as a means to speed up the centralized simulation process.”
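
The augmentation idea in the talk can be illustrated with a deliberately toy stand-in: fit a simple generative model to a small simulated sample, then draw many more synthetic events from it. In this hedged sketch a Gaussian fit plays the role of the GAN or variational autoencoder discussed by the speaker; the variable names and the 91 GeV “mass peak” are illustrative assumptions, not CERN data.

```python
import random
import statistics

# Toy illustration (NOT the talk's actual models): augment a small
# simulated sample by fitting a generative model and sampling new events.
# A Gaussian fit stands in for the GAN/VAE of the talk.

random.seed(42)
# Hypothetical small batch of expensively simulated events (a mass peak).
simulated = [random.gauss(91.0, 2.5) for _ in range(500)]

# "Training" the toy generative model = estimating its two parameters.
mu = statistics.fmean(simulated)
sigma = statistics.stdev(simulated)

# Cheap augmentation: draw 10x more events from the fitted model.
augmented = [random.gauss(mu, sigma) for _ in range(5000)]
print(len(simulated) + len(augmented))
```

The point of the pattern is cost asymmetry: full detector simulation is expensive per event, while sampling a trained generative model is nearly free, so an analysis can stretch a limited simulation budget.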

Training Generative Adversarial Models over Distributed Computing Systems

Gul Rukh Khattak from CERN gave this talk at PASC18. “We use a dataset composed of the energy depositions from electrons, photons, and charged and neutral hadrons in a fine-grained digital calorimeter. The training of these models is quite computing intensive, even with the help of GPGPUs, and we propose a method to train them over multiple nodes and GPGPUs using a standard message passing interface. We report on the scaling of time-to-solution.”
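
The core pattern behind MPI-based multi-node training is synchronous data parallelism: each rank computes a gradient on its own data shard, the gradients are averaged across ranks (what `MPI_Allreduce` provides), and every model replica applies the same update. This hedged sketch simulates that averaging step in plain Python with a toy one-parameter model; it is not the speakers' code, and the shard layout and learning rate are illustrative assumptions.

```python
# Toy sketch of synchronous data-parallel training (NOT the talk's code).
# Each simulated "worker" holds a data shard and computes a local gradient;
# averaging the shard gradients mimics an MPI_Allreduce followed by a
# division by the number of ranks, keeping all model replicas identical.

def local_gradient(w, shard):
    # Gradient of mean squared error for the model y = w * x on one shard.
    return sum(2.0 * (w * x - y) * x for x, y in shard) / len(shard)

# Toy dataset y = 3x, split round-robin across 4 simulated workers.
data = [(float(x), 3.0 * x) for x in range(1, 9)]
shards = [data[i::4] for i in range(4)]

w, lr = 0.0, 0.01
for _ in range(300):
    grads = [local_gradient(w, s) for s in shards]
    w -= lr * sum(grads) / len(grads)  # stands in for Allreduce + average

print(round(w, 3))  # converges to the true slope 3.0
```

In a real MPI job each shard would live on a different node, the list comprehension over shards would be one local computation per rank, and the averaging line would be a single collective call, which is why time-to-solution scaling hinges on the cost of that communication step.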

Addressing Computing Challenges at CERN openlab

In this special guest feature from Scientific Computing World, Robert Roe speaks with Dr Maria Girone, Chief Technology Officer at CERN openlab, ahead of her keynote presentation at ISC High Performance. “The challenge of creating the largest particle accelerator has now been met, but there is another challenge – harnessing all of the data produced through experimentation. This will become even greater when the ‘high-luminosity’ LHC experiments begin in 2026.”

Agenda Posted for Mass Storage Conference in Santa Clara

The 34th International Conference on Massive Storage Systems and Technologies (MSST 2018) has posted its speaker agenda. The event takes place May 14-16 in Santa Clara, California. “Join the discussion on webscale IT, and the demand on storage systems from IoT, healthcare, scientific research, and the continuing stream of smart applications (apps) for mobile devices.”

Video: Computing Challenges at the Large Hadron Collider

CERN’s Maria Girone gave this talk at the HiPEAC 2018 conference in Manchester. “The Large Hadron Collider (LHC) is one of the largest and most complicated scientific apparatuses ever constructed. In this keynote, I will discuss the challenges of capturing, storing and processing the large volumes of data generated at CERN. I will also discuss how these challenges will evolve towards the High-Luminosity Large Hadron Collider (HL-LHC), the upgrade programme scheduled to begin taking data in 2026 and to run into the 2030s, generating some 30 times more data than the LHC has currently produced.”

Video: HiPEAC Conference Looks at Innovations in Computer Architecture

This video features highlights from the first day at the HiPEAC conference, which took place last week in Manchester. HiPEAC is the premier European forum for experts in computer architecture, programming models, compilers and operating systems for embedded and general-purpose systems. “HiPEAC Coordinator Koen De Bosschere tells us what’s been going on, including a great keynote talk from Maria Girone at CERN. Meanwhile, Arm’s John Goodacre tells us why Manchester is a great location for the conference and why it’s important for companies like Arm to be involved in HiPEAC.”

Firing up a Continent with HPC

In this special guest feature from Scientific Computing World, Nox Moyake describes the process of entrenching and developing HPC in South Africa. “The CHPC currently has about 1,000 users; most are in academia, with others in industry. The centre supports research across a number of domains and participates in a number of grand international projects such as the CERN and SKA projects.”

CERN openlab Joins Intel Modern Code Developer Challenge

James Reinders writes that a new Intel Modern Code Developer Challenge has teamed up with CERN openlab. “It is always an exciting time when I get to announce a Modern Code Developer Challenge from my friends at Intel, but it is even more special when I get to announce a collaboration with the brilliant minds at CERN. Beginning this month (July 2017), and running for nine weeks, five exceptional students participating in the CERN openlab Summer Student Programme are working to research and develop solutions for five modern-code-centered challenges.”

SKA and CERN Sign Big Data Agreement

“The signature of this collaboration agreement between two of the largest producers of science data on the planet shows that we are really entering a new era of science worldwide”, said Prof. Philip Diamond, SKA Director-General. “Both CERN and SKA are and will be pushing the limits of what is possible technologically, and by working together and with industry, we are ensuring that we are ready to make the most of this upcoming data and computing surge.”

E4 Engineering Powers CERN with Quanta Cloud Technology

Over the last two years, E4 Computer Engineering and QCT have worked in close collaboration to supply CERN with thousands of server systems, cores, and petabytes of storage. “E4 has successfully supplied us with reliable and performant servers of the QCT brand over the last couple of years,” said Olof Bärring, Deputy Head of the computing facilities group, IT department, CERN. “The systems have proved to be suitable for different purposes, ranging from High Throughput Computing (HTC) number crunching of physics data coming from LHC experiments to High Performance Computing (HPC) clusters for the CERN theory group. We are also very pleased with the reliable deliveries and the warranty support provided.”