As the Large Hadron Collider restarts at CERN, data storage has become as important to scientists as compute power. But, as Tom Wilkie of Scientific Computing World reports, the innovative technologies being developed have much wider applications.
“The mathematics involved in simulating these events is very sophisticated, because one has to solve the equations of Einstein’s general relativity and magnetohydrodynamics all together. The problem also requires very advanced supercomputers running programs on tens of thousands of CPUs simultaneously, as well as sophisticated techniques for data extraction and visualization. Petascale numerical simulation is therefore the only tool available to model these systems accurately.”
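For context, the coupled system that quote refers to pairs Einstein’s field equations with the equations of ideal magnetohydrodynamics. A schematic form of this standard GRMHD system, in geometric units (G = c = 1; the group’s actual formulation may differ in detail), is:

    G_{\mu\nu} = 8\pi T_{\mu\nu}

    \nabla_\mu T^{\mu\nu} = 0, \qquad
    \nabla_\mu (\rho u^\mu) = 0, \qquad
    \nabla_\mu {}^{*}F^{\mu\nu} = 0

    T^{\mu\nu} = (\rho h + b^2)\, u^\mu u^\nu
               + \left(p + \tfrac{1}{2} b^2\right) g^{\mu\nu}
               - b^\mu b^\nu

Here \rho is the rest-mass density, h the specific enthalpy, p the pressure, u^\mu the fluid four-velocity, and b^\mu the magnetic field four-vector. Evolving the spacetime and the magnetized fluid together, step by step, is what drives the computational cost.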
“The largest high-redshift cosmological simulation of galaxy formation ever performed has recently been completed by a group of astrophysicists (Drs. Feng, Di Matteo, Croft, Bird, and Battaglia) from the U.S. and the U.K. This tour-de-force simulation was run on the Blue Waters Cray XE/XK system at NCSA and employed 648,000 cores. The team used approximately 700 billion particles (!) to represent dark matter and ordinary matter and to create virtual galaxies inside the supercomputer. The authors, who represent Carnegie Mellon University, UC Berkeley, Princeton University, and the University of Sussex, have given their simulation the moniker BlueTides.”
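To put those figures in perspective, a quick back-of-the-envelope calculation from the numbers quoted above (the byte count per particle is an illustrative assumption, not a figure from the paper):

    # Rough scale of the BlueTides run, using the figures quoted above.
    particles = 700e9        # ~7 x 10^11 particles
    cores = 648_000          # Blue Waters cores employed

    print(f"particles per core: {particles / cores:,.0f}")  # ~1.1 million

    # Illustrative assumption: ~100 bytes of state per particle (positions,
    # velocities, IDs, gas quantities). The real footprint depends on the
    # simulation code's data layout.
    bytes_per_particle = 100
    print(f"approximate particle data: {particles * bytes_per_particle / 1e12:.0f} TB")

That works out to roughly a million particles per core, and tens of terabytes of particle state resident in memory at once.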
“SpaceX is designing a new, methane-fueled engine powerful enough to lift the equipment and personnel needed to colonize Mars. A vital aspect of this effort involves the creation of a multi-physics code to accurately model a running rocket engine. The scale and complexity of turbulent non-premixed combustion have so far made it impractical to simulate, even on today’s largest supercomputers. We present a novel approach using wavelets on GPUs, capable of capturing physics down to the finest turbulent scales.”
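The appeal of wavelets here is that turbulent fields tend to be sparse in a wavelet basis: fine-scale coefficients are significant only where the flow actually has fine-scale structure. The toy one-dimensional sketch below, written with the PyWavelets library, illustrates that sparsity; the signal, wavelet choice, and tolerance are illustrative assumptions, and this is in no way the SpaceX solver itself.

    import numpy as np
    import pywt  # PyWavelets

    # Toy 1-D "flow field": a smooth background plus one sharp, localized
    # feature, mimicking how turbulence confines fine-scale structure to
    # small regions of the domain.
    x = np.linspace(0.0, 1.0, 1024)
    signal = np.sin(2.0 * np.pi * x) + np.exp(-((x - 0.5) ** 2) / 1e-4)

    # Multi-level discrete wavelet transform.
    coeffs = pywt.wavedec(signal, "db4", level=6)

    # Drop coefficients below a tolerance: resolution is retained only
    # where the field actually varies on fine scales.
    tol = 1e-3  # illustrative threshold
    sparse = [pywt.threshold(c, tol, mode="hard") for c in coeffs]

    kept = sum(int(np.count_nonzero(c)) for c in sparse)
    total = sum(c.size for c in coeffs)
    print(f"retained {kept}/{total} coefficients ({100.0 * kept / total:.1f}%)")

    # Reconstruction error stays small despite the compression.
    recon = pywt.waverec(sparse, "db4")
    print("max reconstruction error:", np.max(np.abs(recon[:signal.size] - signal)))

An adaptive solver built on this idea evolves only the retained coefficients, which is what makes capturing the finest turbulent scales tractable on GPU hardware.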