In this video, Marcel Vreeswijk and Hurng-Chun Lee from Nikhef, the National Institute for Subatomic Physics in the Netherlands, explain how customized grid computing workflows are key to filtering LHC datasets down to a manageable size.
The Large Hadron Collider (LHC) is the world’s largest and most complex experiment, at the cutting edge of high-energy physics. Particle physicists use the LHC to search for deviations from the Standard Model and to discover potential new laws of physics. The particle known as the top quark is a window into this weird and wonderful world. The LHC produces enormous amounts of data, enough to fill piles of DVDs. Without grid computing tools, it would be impossible to pick out the collision events that could hold the clues to top quark behaviour.
Check out more Tales from the Grid on YouTube.