ISC Keynote: Tackling Tomorrow’s Computing Challenges Today at CERN

In this keynote video from ISC 2018, Maria Girone, physicist and CTO of CERN openlab, discusses the demands of capturing, storing, and processing the large volumes of data generated by the LHC experiments.

CERN openlab is a unique public-private partnership between the European Organization for Nuclear Research (CERN) and some of the world's leading ICT companies. It plays a leading role in helping CERN address the computing and storage challenges related to the Large Hadron Collider's (LHC) upgrade program.

The LHC is the world's most powerful particle accelerator and is one of the largest and most complicated machines ever built. The LHC collides proton pairs 40 million times every second at each of four interaction points, where the four main particle detectors are hosted. This extremely high rate of collisions makes it possible to identify rare phenomena and is vital in helping physicists reach the requisite level of statistical certainty to declare new discoveries, such as the Higgs boson in 2012. Extracting a signal from this huge background of collisions is one of the most significant challenges faced by the high-energy physics (HEP) community.
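To get a feel for why such an enormous collision rate matters, consider a simple counting-experiment estimate of statistical significance, z ≈ s/√b, where particle physics conventionally requires roughly 5 sigma to claim a discovery. The sketch below uses the 40-million-collisions-per-second figure from the article, but the signal fraction and background level are made-up placeholder values chosen purely for illustration, not CERN numbers.

```python
# Illustrative only: why a huge number of collisions is needed before a rare
# signal stands out from the background. The collision rate comes from the
# article; the signal fraction and background ratio are assumed placeholders.
import math

def naive_significance(n_signal: float, n_background: float) -> float:
    """Simple counting-experiment significance estimate, z ~ s / sqrt(b)."""
    return n_signal / math.sqrt(n_background)

collisions_per_second = 40e6          # 40 million collisions per second (from the article)
signal_fraction = 1e-10               # assumed rarity of the interesting process
background_to_signal_ratio = 1e3      # assumed background level after event selection

for days in (1, 30, 365):
    collisions = collisions_per_second * 86_400 * days
    s = collisions * signal_fraction
    b = s * background_to_signal_ratio
    z = naive_significance(s, b)
    print(f"{days:4d} days: ~{s:.0f} signal events, significance ~ {z:.1f} sigma")
```

With these invented numbers, a day of running yields well under 1 sigma, while a full year pushes past the 5-sigma threshold, which is why sustained high collision rates are essential.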

Today, the Worldwide LHC Computing Grid regularly operates around 750,000 processor cores and nearly half an exabyte of disk storage.

Computing and storage demands will become even more pressing when CERN launches the next-generation "High-Luminosity" LHC in 2026. At that point, the total computing capacity required by the experiments is projected to be 50 to 100 times greater than today, with storage needs expected to be on the order of exabytes. Even assuming the expected improvements in IT technologies, and given the realities of a constant budget, the current approach to data processing will not be sustainable. This is why an intense R&D program is ongoing to explore alternative approaches to the High-Luminosity LHC big data problem.
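A back-of-envelope calculation makes the gap concrete. The 50 to 100 times requirement comes from the article; the assumed ~20% yearly price/performance gain at a flat budget is an illustrative assumption, not an official CERN figure.

```python
# Back-of-envelope sketch of the capacity gap described above. The 50-100x
# requirement is from the article; the ~20% annual gain at a flat budget is
# an assumption for illustration only.

def flat_budget_growth(annual_gain: float, years: int) -> float:
    """Capacity multiplier if a constant budget buys `annual_gain` more each year."""
    return (1.0 + annual_gain) ** years

years_to_hl_lhc = 8            # roughly 2018 -> 2026
assumed_annual_gain = 0.20     # assumed price/performance improvement per year

affordable = flat_budget_growth(assumed_annual_gain, years_to_hl_lhc)
for required in (50, 100):
    shortfall = required / affordable
    print(f"Required {required}x vs affordable ~{affordable:.1f}x "
          f"-> a ~{shortfall:.0f}x gap to close with new approaches")
```

Under these assumptions, a constant budget buys only about a 4x capacity increase by 2026, leaving an order-of-magnitude shortfall that technology trends alone cannot cover, hence the R&D program.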

One area of medicine that can utilize CERN's technologies and expertise is hadron therapy, a rapidly developing technique for tumor treatment. The next step in radiation therapy is the use of carbon and other ions. This type of therapy has clear advantages over the use of protons, providing both local control of very aggressive tumors and lower toxicity, thus enhancing quality of life during and after cancer treatment. By 2020, there will be around 100 centers offering hadron therapy worldwide, at least 30 of them in Europe.

Maria Girone was awarded her Ph.D. in high-energy physics in 1994. In 2002, Girone joined the IT Department as an applied scientist and CERN staff member. Two years later, she was appointed section leader and service manager of the Oracle database services for the LHC experiments. In 2012, Girone became the founding chair of the WLCG Operations Coordination team, responsible for the core operations and commissioning of new services in the WLCG. In 2014, she was appointed Computing Coordinator for the CMS experiment at CERN for two years. As coordinator, Girone was responsible for 70 computing centers on five continents and more than 100 FTEs of effort each year to archive, simulate, process, and serve petabytes of data. Later, Girone joined the management team of CERN openlab, taking over the position of CTO as of January 2016.

Check out our insideHPC Events Calendar