Supercomputing LHC Experiments with Titan

University of Texas at Arlington physicists are preparing the Titan supercomputer at the Oak Ridge Leadership Computing Facility in Tennessee to support the analysis of data generated by the quadrillions of proton collisions expected during this season's Large Hadron Collider particle physics experiments.

Why the HPC Industry Will Converge on Europe at ISC 2016

In this special guest feature from Scientific Computing World, ISC’s Nages Sieslack highlights a convergence of technologies around HPC, a focus of the ISC High Performance conference, which takes place June 19-23 in Frankfurt. “In addition to the theme of convergent HPC technologies, this year’s conference will also offer two days of sessions in the industry track, specially designed to meet the interests of commercial users. Our focus is Industrie 4.0, a German strategic initiative conceived to take a leading role in pioneering industrial IT, which is currently revolutionizing engineering in the manufacturing sector.”

Next Generation SDN Driven Systems for Exascale Data Intensive Science

Harvey Newman from Caltech presented this talk at the Mellanox booth at SC15. "We describe activities of the Caltech High Energy Physics team and collaborators, related to the use of Software Defined Networking to help achieve fast and efficient data distribution and access. Results from Supercomputing 2014 are presented together with our work on the Advanced Network Services for the Experiments project, and a new project developing a Next Generation Integrated SDN Architecture, as well as our plans for Supercomputing 2015."

Video: High Performance Computing for the LHC

In this video, Fermilab's Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computing resources that make the LHC possible. "The LHC is the world's highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter."
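To get a feel for the scale Dr. Lincoln describes, a back-of-envelope pass through the LHC trigger chain helps. The sketch below uses approximate, commonly quoted round numbers (a ~40 MHz bunch-crossing rate, ~100 kHz and ~1 kHz trigger output rates, ~1 MB events, and an assumed ~120-day running year); these are assumptions for illustration, not figures taken from the video.

```python
# Rough illustration of the LHC trigger chain's data reduction.
# All figures are approximate, commonly quoted round numbers
# (assumptions for this sketch, not taken from the video).

bunch_crossing_rate_hz = 40e6  # proton bunches cross ~40 million times per second
l1_output_rate_hz = 1e5        # hardware (Level-1) trigger keeps ~100 kHz
hlt_output_rate_hz = 1e3       # software (High-Level) trigger keeps ~1 kHz
event_size_bytes = 1e6         # ~1 MB per recorded event

rejection = bunch_crossing_rate_hz / hlt_output_rate_hz   # ~40,000x reduction
to_storage_bps = hlt_output_rate_hz * event_size_bytes    # ~1 GB/s to storage
seconds_of_running = 3600 * 24 * 120                      # ~120 days/year (assumed)
per_year_bytes = to_storage_bps * seconds_of_running

print(f"rejection factor: {rejection:,.0f}x")
print(f"to storage: {to_storage_bps/1e9:.1f} GB/s")
print(f"per running year: ~{per_year_bytes/1e15:.0f} PB")
```

Even after rejecting roughly 40,000 collisions for every one kept, the experiments still write on the order of ten petabytes per running year, which is why the supporting computing infrastructure is so large.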

High-Throughput Data Acquisition at the CMS Experiment at CERN

"The CMS detector at the Large Hadron Collider at CERN underwent a replacement of its data acquisition network to be able to process the increased data rate expected in the coming years. We will present the architecture of the system and discuss the design of its layers, which are based on InfiniBand as well as 10 and 40 Gbit/s Ethernet."
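A rough capacity estimate shows why the DAQ network needs InfiniBand and 40 Gbit/s Ethernet at its core. The sketch below is a back-of-envelope calculation, not a figure from the talk: the ~1 MB event size and ~100 kHz Level-1 accept rate are assumed round numbers commonly quoted for CMS, and the 80% usable-link-efficiency factor is purely illustrative.

```python
# Back-of-envelope sizing for a CMS-style event-builder network.
# The ~1 MB event size and ~100 kHz Level-1 accept rate are assumed
# round figures commonly quoted for CMS, not numbers from the talk;
# the 80% usable-link-efficiency factor is purely illustrative.

event_size_bytes = 1e6     # ~1 MB average event size (assumption)
l1_accept_rate_hz = 1e5    # ~100 kHz Level-1 accept rate (assumption)

aggregate_bps = event_size_bytes * l1_accept_rate_hz * 8  # bits/s into the builder
link_capacity_bps = 40e9                                  # one 40 Gbit/s Ethernet link
usable_fraction = 0.8                                     # protocol-overhead allowance

links_needed = aggregate_bps / (link_capacity_bps * usable_fraction)
print(f"aggregate: {aggregate_bps/8/1e9:.0f} GB/s "
      f"({aggregate_bps/1e9:.0f} Gbit/s), "
      f"40GbE links needed: {links_needed:.0f}")
```

Under these assumptions the event builder must sustain on the order of 100 GB/s (800 Gbit/s), i.e. a couple of dozen 40 GbE links running in parallel, which is the kind of load that motivates a layered InfiniBand and 10/40 Gbit/s Ethernet design.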