Do We Underestimate the Real Challenge of Exascale?


Over at the ISC Blog, Mark Parsons from the EPCC supercomputing centre writes that scalable software is the real Grand Challenge of Exascale.

I believe that the problems that we’ve seen at the Petascale with regard to the scaling of many codes are insurmountable if we take the incremental change approach at the Exascale. Looking at the CRESTA codes, it is highly unlikely any of them will scale to the Exascale, even allowing for weak scaling (through increased resolution of the model under study) using incremental improvements. This means we need to think about disruptive changes to codes in order to meet the challenge.
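As a rough illustration of what "weak scaling" buys in the idealised case, the sketch below applies Gustafson's law, where the problem size (for example, model resolution) grows with the thread count. The serial fractions are assumed purely for illustration and are not CRESTA measurements; real codes fall well short of these idealised figures because the model ignores communication, I/O and load imbalance, which is precisely Parsons' point.

```python
# Illustrative weak-scaling estimate (Gustafson's law).
# The serial fractions are assumed for the sake of example, not measured CRESTA data.

def gustafson_speedup(serial_fraction: float, n_threads: int) -> float:
    """Scaled speedup when the problem size grows with the thread count."""
    return serial_fraction + (1.0 - serial_fraction) * n_threads

for s in (0.01, 0.001):
    for n in (10_000, 1_000_000, 1_000_000_000):
        print(f"serial fraction {s:.2%}, {n:>13,} threads: "
              f"scaled speedup ~{gustafson_speedup(s, n):,.0f}x")
```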

Parsons leads the CRESTA FP7 project, which is focussing its work on a small set of six HPC applications that are widely used today and represent the sort of codes that will have to run on Exascale systems. He says that over the past 20 years the community has managed to cope with each new generation of hardware by incrementally improving its codes. But today, simply changing a solver or making some other disruptive change to an existing code will not be enough.

We simply do not understand how to compute using one billion parallel threads (except perhaps in trivial cases). It requires us to completely rethink how we simulate our physical world using this much parallelism. The problem goes to the foundations of modern modelling and simulation – we need to think beyond the tools we have today and invent new methods to express the mathematical descriptions of the physical world around us, on these and even larger systems in the future. Only by doing this will we move modelling and simulation forward for the next 20 years. This is the real challenge we face at the Exascale.
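A crude back-of-the-envelope calculation hints at why a billion threads is qualitatively different. Under Amdahl's law (strong scaling of a fixed-size problem), even a tiny serial fraction caps the achievable speedup far below the machine's parallelism; the serial fractions below are assumed for illustration only.

```python
# Illustrative strong-scaling limit (Amdahl's law) at extreme thread counts.
# The serial fractions are assumed purely for illustration.

def amdahl_speedup(serial_fraction: float, n_threads: int) -> float:
    """Speedup for a fixed-size problem spread across n_threads."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_threads)

threads = 1_000_000_000  # one billion parallel threads
for s in (0.01, 0.001, 0.000001):
    print(f"serial fraction {s:.4%}: speedup capped at "
          f"~{amdahl_speedup(s, threads):,.0f}x over a single thread")
```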

Read the Full Story.