Maria Girone from CERN openlab to Keynote ISC 2018

Today ISC 2018 announced that Maria Girone from CERN openlab will keynote the conference on Monday, June 25. Her talk will focus on the demands of capturing, storing, and processing the large volumes of data generated by the experiments at the Large Hadron Collider (LHC).

“I will discuss some of the approaches we are considering to grapple with these enormous data requirements, including deploying resources through commercial clouds and employing new techniques, such as alternative computing architectures, advanced data analytics, and deep learning,” explains Girone. “Finally, I will present some medical applications resulting from the research at CERN.”

The LHC is the world’s most powerful particle accelerator and one of the largest and most complicated machines ever built. It collides protons 40 million times every second at each of four interaction points, where four particle detectors are hosted. This extremely high collision rate makes it possible to identify rare phenomena and is vital in helping physicists reach the requisite level of statistical certainty to declare new discoveries, such as the Higgs boson in 2012. Extracting a signal from this huge background of collisions is one of the most significant challenges faced by the high-energy physics (HEP) community.
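To get a feel for why filtering this background is essential, consider a back-of-envelope estimate of the raw data rate. The event size and post-trigger rate used below are illustrative assumptions for the sketch, not official CERN figures:

```python
# Back-of-envelope estimate of the raw LHC data rate, to illustrate why
# online filtering ("triggering") is unavoidable. The event size and the
# post-trigger rate are illustrative assumptions, not official figures.

collision_rate_hz = 40e6      # 40 million collisions per second (per interaction point)
raw_event_size_bytes = 1e6    # assume ~1 MB per recorded event (illustrative)

raw_rate = collision_rate_hz * raw_event_size_bytes
print(f"Unfiltered data rate: {raw_rate / 1e12:.0f} TB/s per detector")  # ~40 TB/s

# A multi-stage trigger keeps only a tiny fraction of events (illustrative
# figure of ~1,000 per second), reducing the stream to an archivable rate.
kept_events_per_s = 1e3
stored_rate = kept_events_per_s * raw_event_size_bytes
print(f"Post-trigger rate: {stored_rate / 1e9:.0f} GB/s per detector")   # ~1 GB/s
```

Under these assumptions, the detectors must discard all but roughly one event in forty thousand in real time, which is precisely the signal-versus-background challenge described above.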

The HEP community has long been a driver in processing enormous scientific datasets and in managing some of the largest-scale high-throughput computing centers. Together with industry leaders in a range of technologies, including processing, storage, and networking, HEP researchers developed one of the first scientific computing grids: a collaboration of more than 170 computing centers in 42 countries, spread across five continents. Today, the Worldwide LHC Computing Grid (WLCG) regularly operates around 750,000 processor cores and nearly half an exabyte of disk storage.

Computing and storage demands will become even more pressing when CERN launches the next-generation “High-Luminosity” LHC in 2026. At that point, the total computing capacity required by the experiments is projected to be 50 to 100 times greater than today, with storage needs on the order of exabytes. Even assuming the expected improvements in IT technologies, and given the reality of a constant budget, the current approach to data processing will not be sustainable. This is why an intense R&D program is under way to explore alternative approaches to the High-Luminosity LHC’s big data problem.
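A rough calculation shows the size of this gap. Assuming, purely for illustration, that capacity per unit cost improves around 20% per year (a generic technology-trend figure, not a CERN projection), a flat budget falls far short of the projected requirement:

```python
# Sketch of the "flat budget" problem for the High-Luminosity LHC.
# The 20% annual improvement is an illustrative assumption.

annual_improvement = 0.20
years = 2026 - 2018

capacity_growth = (1 + annual_improvement) ** years
required_low, required_high = 50, 100

print(f"Flat-budget capacity growth by 2026: ~{capacity_growth:.1f}x")
print(f"Required growth: {required_low}-{required_high}x")
print(f"Shortfall: at least {required_low / capacity_growth:.0f}x must come "
      f"from new approaches rather than hardware trends alone")
```

Under these assumptions, hardware trends deliver only a ~4x gain by 2026, leaving a gap of an order of magnitude or more to be closed by the R&D efforts Girone will describe.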

One area of medicine that can benefit from CERN’s technologies and expertise is hadron therapy, a rapidly developing technique for tumor treatment. The next step in radiation therapy is the use of carbon and other ions. This type of therapy has clear advantages over the use of protons, providing both local control of very aggressive tumors and lower toxicity, thus enhancing quality of life during and after cancer treatment. By 2020, there will be around 100 centers around the world offering hadron therapy, at least 30 of them in Europe.

Maria Girone was awarded her Ph.D. in high-energy physics in 1994. In 2002, she joined CERN’s IT Department as an applied scientist and staff member. Two years later, she was appointed section leader and service manager of the Oracle database services for the LHC experiments. In 2012, Girone became the founding chair of the WLCG Operations Coordination team, responsible for core operations and the commissioning of new services in the WLCG. In 2014, she was appointed Computing Coordinator for the CMS experiment at CERN for two years; as coordinator, she was responsible for 70 computing centers on five continents and more than 100 FTEs of effort annually to archive, simulate, process, and serve petabytes of data. Girone later joined the management team of CERN openlab, becoming its CTO in January 2016.

Check out our insideHPC Events Calendar