European HPC Strategy Gathers Funding and Momentum


Tripartite cooperation between industry, academia, and the European Commission is on course to help Europe achieve leadership in high-performance computing, as Tom Wilkie from Scientific Computing World learned at PRACEdays14 in Barcelona.

Can Europe become a powerhouse of HPC technology? It is looking increasingly likely, judging by presentations at the PRACEdays meeting, held in Barcelona from 20 to 22 May.

Certainly, there are glittering prizes at stake. According to Jean-Francois Lavignon, of French supercomputer manufacturer Bull, the HPC market is worth around $10bn annually for the computers alone, a figure that at least doubles when the complete value chain is taken into account. Moreover, he said, HPC is a dynamic market with a significant growth rate.

Lavignon also leads the ETP4HPC – the European Technology Platform for High-Performance Computing, a group of vendors and manufacturers of computer systems and components who hope to spearhead Europe’s bid for leadership in the field. He pointed out that the ETP4HPC had successfully launched a Public-Private Partnership with the European Commission, the first fruits of which would be to assess bids for a set of European Centres of Excellence in HPC, to be set up early next year.

The commitment of public funds is creating momentum behind a project that, as reported here almost exactly a year ago, was just starting to come together in 2013. The PRACEdays announcements also provide an interesting prelude to the International Supercomputing Conference (ISC’14), which opens in Leipzig, Germany, at the end of June. On Sunday 22 June, in advance of the main event itself, there will be a full-day ‘Workshop on International Cooperation for Extreme-Scale Computing’.

The European model, Leonardo Flores Anover from the European Commission told the PRACEdays meeting, will involve an initial tranche of between 4 and 5 million euros for each of the new Centres of Excellence. He expected that between eight and ten centres would be created in the first wave. The final deadline for bids is 14 January 2015. But in a clear sign of the Commission's commitment to the initiative, its understanding of the complex issues involved, and hence of the need for flexibility, he frankly acknowledged that the initial tranche of funding will not be enough to create critical mass from day one. There will be further calls and further funding, he said.

In harnessing the talents and resources of the public and private sectors to develop European leadership in HPC, he said, Europe was essentially starting from scratch. The first wave of centres of excellence “will show us how to mobilise as much funding from private sources to match public funds,” he said. The first wave will be risky, he went on: “we don’t know who will commit.”

Thanks to the cross-European access to Tier 0 supercomputing resources already pioneered through Prace, some of the centres of excellence may initially not need to have hardware locally. But in the follow-up, he said, the successful centres of excellence can expect to have more resources.

Catherine Riviere, the outgoing Chair of the Prace Council, reminded the meeting that Prace has been providing a European HPC Research Infrastructure since 2010. It has six Tier 0 supercomputers at its disposal, in France, Spain, Germany and Italy, allowing it to provide 15 Petaflop/s in 2014 to the European scientific community through a peer-reviewed award system. Since 2010, it has awarded 8bn core hours, but the requests for time on its infrastructure are about double the resources available.

The PRACEdays meeting reviewed several of the projects that had been awarded core hours. One was an application in biomedicine in which supercomputers were used to simulate the human respiratory system, which, it is hoped, will lead to better surgical procedures. Prace also awards time to small and medium enterprises, in order to widen the uptake of HPC techniques and enhance its contribution to European economic advancement, as described here.

The seriousness with which the European Commission is taking high-performance computing can be gauged from the fact that it provided two speakers on the topic at the PRACEdays meeting. Augusto Burgueno Arjona, head of the e-infrastructure unit at the Directorate General Connect, stressed that HPC was a tool to address societal challenges and not just an academic exercise. But he also talked about the Commission’s aim of building a European value chain in HPC that was globally competitive. About 93 million euros would be spent in 2014-2015 on developing exascale technologies, he continued.

Jean-Francois Lavignon, on behalf of the ETP4HPC, saw HPC as a strategic technology for industry and commerce, both in Europe and worldwide. Traditional users of HPC included the automotive and aerospace industries but, he pointed out, if research and development into exascale technologies led to “cheap flops” then more users could be brought in from the life sciences, entertainment, and other non-traditional HPC users.

But he identified at least four challenges on the road to exascale. The first was that the number of cores would rise exponentially, making massive parallelism inevitable, with implications for usable application software. Power consumption was the second challenge: how to decrease the amount of energy that each core uses as the number of cores rises. He also identified data management, not just numerical processing, as a challenge to be overcome. Some applications, ranging from the LHC at Cern to the simulation of combustion for leaner, more fuel-efficient engines, may generate exabytes of raw data that it will be impossible to store, so the data would have to be visualised in the loop.
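The idea of handling data "in the loop" rather than storing it can be sketched in a few lines. The following is a minimal, hypothetical illustration (all names invented, and the per-step statistics chosen arbitrarily): instead of writing every raw timestep to disk, the simulation computes a small summary inside the loop and discards the raw field immediately.

```python
# Hypothetical sketch of in-the-loop (in-situ) data reduction:
# raw per-step data is summarised as it is produced, never stored in full.

def simulate_step(step):
    # Stand-in for one timestep of a real solver: returns a field of values.
    return [((step * 31 + i * 17) % 100) / 100.0 for i in range(1000)]

def run_in_situ(n_steps):
    summaries = []  # tiny compared with the raw field data
    for step in range(n_steps):
        field = simulate_step(step)  # raw data exists only transiently
        summaries.append({
            "step": step,
            "min": min(field),
            "max": max(field),
            "mean": sum(field) / len(field),
        })
        # 'field' is discarded here: only the summary survives the loop.
    return summaries

summaries = run_in_situ(100)
print(len(summaries))  # 100 small summaries instead of 100 full fields
```

In a real exascale workflow the reduction step would be a visualisation or analysis pipeline running alongside the solver, but the principle is the same: the raw exabytes never touch storage.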

The final challenge was that of resilience, he said. The increase in the number of nodes and the increase in simulation time of applications created a need for redundancy and reliability. But if the solution to the reliability problem was redundancy – i.e. more components – then that would inevitably increase the power budget.

The workshop to be held at ISC'14 starts from the position that the solutions to the challenges of many-core and accelerator technologies will require international cooperation. It hopes to extend the range of that cooperation, as well as providing a detailed vision of the worldwide directions towards extreme-scale computing systems, software support, and, perhaps most important, strategic applications as they relate to national needs.

But Lavignon’s theme, that one of the main points of research into exascale technologies was to lead to ‘cheap flops’ rather than necessarily a full-blown exascale machine, was endorsed by Augusto Burgueno Arjona from the European Commission. He told the PRACEdays14 meeting that, in the European Commission’s strategy, the development of an exascale machine was not necessarily the goal; rather, there was value in the process itself: developing the know-how for the next generation of computers and applications would diffuse the technology and expertise to a wider set of users, who would be able to capitalise on it and thus expand the European economy.

This story appears here as part of a cross-publishing agreement with Scientific Computing World.