Video: Data Processing at the LHC

https://www.youtube.com/watch?v=sfyHdFo5GWE

In this Google Tech Talk video, Bob Jones, EU Projects Leader in the CERN IT Department, presents the immense data processing challenges of the Large Hadron Collider.

ABSTRACT: CERN, the European Organization for Nuclear Research, is one of the world’s largest and most respected centres for scientific research. Its business is fundamental physics, finding out what the Universe is made of and how it works. CERN has recently completed the construction of the LHC, the world’s largest and most powerful particle accelerator. Thousands of scientists around the world have contributed to constructing the sophisticated LHC experiments and they are now eagerly waiting to get their hands on more data to extract the physics during the next fifteen years, the expected lifetime of the LHC. To reach this goal, tens of thousands of computers distributed worldwide are being harnessed in a distributed computing network called the Worldwide LHC Computing Grid (WLCG). This supports the offline computing needs of the LHC experiments, connecting and combining the IT power of more than 150 computer centres in more than 30 countries. The rapid increase in performance of the LHC accelerator is having an impact on the computing requirements since it increases the rate, complexity and quantity of data that the LHC experiments need to store, distribute and process. The previous estimates of 15 Petabytes per year of stored data are already looking conservative and so it is necessary to plan to go well beyond this figure. This presentation will give an overview of CERN, its IT department, WLCG and how we expect the computing environment for the LHC to evolve in the future.
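To put the 15 Petabytes per year mentioned in the abstract into perspective, here is a quick back-of-envelope sketch of what that volume implies as a sustained transfer rate. Only the 15 PB/year figure comes from the abstract; the assumption of SI petabytes and a uniform rate over the year is mine, and the result ignores the replication of data across the 150+ WLCG computer centres, which multiplies the actual network traffic.

```python
# Back-of-envelope: what does "15 Petabytes per year" mean as a sustained rate?
# Illustrative arithmetic only; 15 PB/year is the figure from the talk abstract,
# and SI units plus a uniform rate over the year are assumptions.

PETABYTE = 10**15               # bytes (SI petabyte)
SECONDS_PER_YEAR = 365 * 24 * 3600

stored_per_year_bytes = 15 * PETABYTE
avg_rate_bytes_per_s = stored_per_year_bytes / SECONDS_PER_YEAR

print(f"Average sustained rate: {avg_rate_bytes_per_s / 10**6:.0f} MB/s")
print(f"                        {avg_rate_bytes_per_s * 8 / 10**9:.1f} Gbit/s")
# Roughly 475 MB/s (about 3.8 Gbit/s) averaged over the year, before any
# replication between WLCG sites is counted.
```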

For more information on the LHC worldwide network, check out my podcast interview with Jean-Michel Jouanigot, Communication Systems Group Leader at CERN.