Argonne is Supercomputing Big Data from the Large Hadron Collider

Over at Argonne, Madeleine O’Keefe writes that the Lab is supporting CERN researchers working to interpret Big Data from the Large Hadron Collider (LHC), the world’s largest particle accelerator. The LHC is expected to output 50 petabytes of data this year alone, the equivalent of nearly 15 million high-definition movies, an amount so enormous that analyzing it all poses a serious challenge to researchers.
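For a rough sense of scale, the movie comparison is easy to check with back-of-the-envelope arithmetic. The sketch below assumes an HD movie of roughly 3.5 GB, an illustrative figure rather than one taken from the article.

```python
# Back-of-the-envelope check of the "50 PB ~ 15 million HD movies" claim.
# The 3.5 GB per-movie size is an assumption for illustration only.
PETABYTE = 10**15  # bytes
GIGABYTE = 10**9   # bytes

lhc_output = 50 * PETABYTE
hd_movie = 3.5 * GIGABYTE

print(f"{lhc_output / hd_movie:,.0f} HD movies")
# -> about 14,285,714, i.e. "nearly 15 million"
```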

Generative Models for Application-Specific Fast Simulation of LHC Collision Events

Maurizio Pierini from CERN gave this talk at PASC18. “We investigate the possibility of using generative models (e.g., GANs and variational autoencoders) as analysis-specific data augmentation tools to increase the size of the simulation data used by the LHC experiments. With the LHC entering its high-luminosity phase in 2025, the projected computing resources will not be able to sustain the demand for simulated events. Generative models are already being investigated as a means to speed up the centralized simulation process.”
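To make the idea concrete, here is a minimal sketch of a GAN whose generator maps random noise to calorimeter-like energy deposits. The 16x16 cell grid, network sizes, and PyTorch training loop are illustrative assumptions, not the architecture described in the talk.

```python
import torch
import torch.nn as nn

# Toy "calorimeter image": a flattened 16x16 grid of energy deposits.
CELLS, LATENT = 16 * 16, 64

generator = nn.Sequential(              # maps random noise -> fake shower
    nn.Linear(LATENT, 256), nn.ReLU(),
    nn.Linear(256, CELLS), nn.ReLU(),   # keeps cell energies non-negative
)
discriminator = nn.Sequential(          # scores real vs. generated showers
    nn.Linear(CELLS, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(real_showers):
    """One adversarial update on a batch of full-simulation showers."""
    batch = real_showers.size(0)
    fake = generator(torch.randn(batch, LATENT))

    # Discriminator: push real showers towards 1, generated ones towards 0.
    d_loss = loss_fn(discriminator(real_showers), torch.ones(batch, 1)) \
           + loss_fn(discriminator(fake.detach()), torch.zeros(batch, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: try to make the discriminator label its output as real.
    g_loss = loss_fn(discriminator(fake), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Placeholder batch standing in for full-simulation events.
train_step(torch.rand(32, CELLS))
```

Once such a model is trained, sampling the generator is far cheaper than running the full detector simulation, which is the appeal of the approach for the high-luminosity era.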

Training Generative Adversarial Models over Distributed Computing Systems

Gul Rukh Khattak from CERN gave this talk at PASC18. “We use a dataset composed of the energy depositions from electrons, photons, and charged and neutral hadrons in a fine-grained digital calorimeter. The training of these models is quite compute-intensive, even with the help of GPGPUs, and we propose a method to train them over multiple nodes and GPGPUs using a standard message passing interface. We report on the scaling of the time-to-solution.”
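One common way to realise this kind of data-parallel training with a standard message passing interface is to average gradients across ranks with an allreduce after every batch. The sketch below uses mpi4py and a toy model to show the pattern; it is an illustration under those assumptions, not the exact scheme presented in the talk.

```python
# Synchronous data-parallel training over MPI: each rank computes
# gradients on its own shard of events, then the gradients are averaged
# with an allreduce before every update.  Run with e.g.:
#   mpirun -n 4 python train_mpi.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

rng = np.random.default_rng(seed=rank)   # different events per rank
weights = np.zeros(1024)                  # toy model parameters
lr = 0.01

def local_gradient(w, batch):
    """Stand-in for the real backward pass on this rank's events."""
    return w - batch.mean(axis=0)          # gradient of a toy quadratic loss

for step in range(100):
    batch = rng.normal(size=(32, 1024))   # this rank's shard of events
    grad = local_gradient(weights, batch)

    # Sum gradients across all ranks in place, then divide by the number
    # of workers to obtain the globally averaged gradient.
    comm.Allreduce(MPI.IN_PLACE, grad, op=MPI.SUM)
    grad /= size

    weights -= lr * grad                   # identical update on every rank

if rank == 0:
    print("final weight norm:", np.linalg.norm(weights))
```

The same allreduce pattern applies whether the workers are CPU ranks or one rank per GPGPU spread across many nodes.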

Addressing Computing Challenges at CERN openlab

In this special guest feature from Scientific Computing World, Robert Roe speaks with Dr Maria Girone, Chief Technology Officer at CERN openlab, ahead of her keynote presentation at ISC High Performance. “The challenge of creating the largest particle accelerator is now complete, but there is another challenge: harnessing all of the data produced through experimentation. This will become even greater when the ‘high-luminosity’ LHC experiments begin in 2026.”

Agenda Posted for Mass Storage Conference in Santa Clara

The 34th International Conference on Massive Storage Systems and Technologies (MSST 2018) has posted its speaker agenda. The event takes place May 14-16 in Santa Clara, California. “Join the discussion on webscale IT and the demand on storage systems from IoT, healthcare, scientific research, and the continuing stream of smart applications (apps) for mobile devices.”

Video: Computing Challenges at the Large Hadron Collider

CERN’s Maria Girone gave this talk at the HiPEAC 2018 conference in Manchester. “The Large Hadron Collider (LHC) is one of the largest and most complicated scientific apparatuses ever constructed. In this keynote, I will discuss the challenges of capturing, storing and processing the large volumes of data generated at CERN. I will also discuss how these challenges will evolve towards the High-Luminosity Large Hadron Collider (HL-LHC), the upgrade programme scheduled to begin taking data in 2026 and to run into the 2030s, generating some 30 times more data than the LHC has currently produced.”

Video: HiPEAC Conference Looks at Innovations in Computer Architecture

This video features highlights from the first day at the HiPEAC conference, which took place last week in Manchester. HiPEAC is the premier European forum for experts in computer architecture, programming models, compilers and operating systems for embedded and general-purpose systems. “HiPEAC Coordinator Koen De Bosschere tells us what’s been going on, including a great keynote talk from Maria Girone at CERN. Meanwhile, Arm’s John Goodacre tells us why Manchester is a great location for the conference and why it’s important for companies like Arm to be involved in HiPEAC.”

Firing up a Continent with HPC

In this special guest feature from Scientific Computing World, Nox Moyake describes the process of entrenching and developing HPC in South Africa. “The CHPC currently has about 1,000 users; most are in academia, with others in industry. The centre supports research across a number of domains and participates in grand international projects such as CERN and the SKA.”

CERN openlab Joins Intel Modern Code Developer Challenge

James Reinders writes that Intel has teamed up with CERN openlab for a new Modern Code Developer Challenge. “It is always an exciting time when I get to announce a Modern Code Developer Challenge from my friends at Intel, but it is even more special when I get to announce a collaboration with the brilliant minds at CERN. Beginning this month (July 2017), and running for nine weeks, five exceptional students participating in the CERN openlab Summer Student Programme are working to research and develop solutions for five modern-code-centered challenges.”

SKA and CERN Sign Big Data Agreement

“The signature of this collaboration agreement between two of the largest producers of science data on the planet shows that we are really entering a new era of science worldwide”, said Prof. Philip Diamond, SKA Director-General. “Both CERN and SKA are and will be pushing the limits of what is possible technologically, and by working together and with industry, we are ensuring that we are ready to make the most of this upcoming data and computing surge.”