
Building the Massive Simulation Sets Essential to Planck Results

Using NERSC supercomputers, Berkeley Lab scientists are generating thousands of simulations to analyze the flood of data from the Planck mission. As a project of the European Space Agency, the Planck satellite mission has been collecting trillions of observations of the sky since the summer of 2009.

“The sheer volume of the Planck data, with about a trillion observations of a billion points on the sky, means that the techniques of exact analysis we used in the past for the data from balloon flights are no longer tractable,” says Julian Borrill of the Computational Research Division. “Instead we have to use approximate methods, and because they’re approximate, we have to worry about their possible uncertainties and biases. The only way to be sure of the Planck analysis is to compare it with a huge suite of Monte Carlo simulations,” long known to be the most challenging aspect of the computation. In preparation, Borrill’s group at C3 has gradually built up its simulation capability over more than a decade, tuning it to each new generation of NERSC supercomputers. The result is a suite of massively parallel codes running on NERSC’s 150,000-core Cray XE6, Hopper.
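The logic behind such a Monte Carlo validation suite can be illustrated with a toy example (this is a hypothetical sketch, not the Planck analysis code): run an approximate estimator over many simulated data sets whose true answer is known, then measure the estimator's bias and scatter from the spread of results.

```python
import random

# Hypothetical illustration of Monte Carlo validation: characterize an
# approximate estimator by running it over many simulated realizations
# with a known input, then measuring its bias and scatter.

random.seed(42)

TRUE_VARIANCE = 1.0  # known input of every simulation
N_SIMS = 1000        # number of Monte Carlo realizations
N_SAMPLES = 500      # observations per simulated data set

def approximate_estimator(data):
    # A deliberately biased variance estimator (divides by n, not n - 1),
    # standing in for any approximate analysis method.
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data) / len(data)

estimates = []
for _ in range(N_SIMS):
    sim = [random.gauss(0.0, TRUE_VARIANCE ** 0.5) for _ in range(N_SAMPLES)]
    estimates.append(approximate_estimator(sim))

mean_est = sum(estimates) / N_SIMS
bias = mean_est - TRUE_VARIANCE
scatter = (sum((e - mean_est) ** 2 for e in estimates) / (N_SIMS - 1)) ** 0.5

print(f"mean estimate: {mean_est:.4f}")
print(f"bias:          {bias:+.4f}")  # theory predicts about -1/N_SAMPLES
print(f"scatter:       {scatter:.4f}")
```

The expense in practice comes from scale: each Planck realization is itself a massive simulation of the sky and the instrument, and thousands of them are needed to pin down the uncertainties, which is why the suite runs on a 150,000-core machine rather than a laptop.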

Read the Full Story.
