Over at TOP500.org, Bernd Mohr writes that Europe’s Human Brain Project will have its main production system located at the Juelich Supercomputing Centre. The HBP supercomputer will be built in stages, with an intermediate “pre-exascale” system on the order of 50 petaflops planned for the 2016-18 timeframe. Full brain simulations are expected to require exascale capabilities, which, according to most potential suppliers’ roadmaps, are likely to be available around 2021-22.
Besides sufficient computing performance, the HBP supercomputer will need to support data-intensive interactive supercomputing and large memory footprints. In addition to the HBP’s main production system in Juelich, there will be a software development system at CSCS, Switzerland, a subcellular computing system at BSC, Spain, and a data analytics system at Cineca, Italy.
To give you a sense of the scale of this task in terms of data size, a recently published data set of the digitized mouse brain at 1-µm resolution comprises 8 terabytes of uncompressed volume data. A volume of the human brain at similar spatial resolution would come to roughly 21,000 terabytes. Interactive exploration of such a data set is beyond the capacity of current computing. Thus, among other methodological problems, data processing becomes a major challenge for any project aiming at the reconstruction of a human brain at cellular resolution.
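The 8 TB → 21,000 TB jump follows from a simple back-of-envelope scaling: at a fixed voxel resolution, data volume grows linearly with brain volume. A minimal sketch, assuming rough literature values of ~500 mm³ for a mouse brain and ~1.3 million mm³ for a human brain (these volume figures are assumptions, not from the article):

```python
# Back-of-envelope scaling from the mouse data set to a human brain,
# assuming uncompressed data volume scales linearly with brain volume
# at a fixed voxel resolution. Brain volumes are rough estimates.

MOUSE_BRAIN_MM3 = 500         # approximate mouse brain volume (assumption)
HUMAN_BRAIN_MM3 = 1_300_000   # approximate human brain volume (assumption)
MOUSE_DATASET_TB = 8          # uncompressed mouse data set, from the article

scale = HUMAN_BRAIN_MM3 / MOUSE_BRAIN_MM3
human_dataset_tb = MOUSE_DATASET_TB * scale

print(f"volume scale factor: {scale:.0f}x")
print(f"estimated human data set: {human_dataset_tb:,.0f} TB")
```

With these assumed volumes the scale factor is about 2,600×, giving roughly 20,800 TB, which is consistent with the ~21,000 TB figure above.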