Why Governments and HPC Need Each Other

Robert Roe

In this special guest feature from Scientific Computing World, Robert Roe writes that government has a critical role to play in the ongoing development of high performance computing.

The US needs to invest in high-performance computing (HPC) so that its industry can continue to stay competitive in global marketplaces, according to a report from the market research firm Intersect360, entitled “The Exascale Effect: the Benefits of Supercomputing Investment for U.S. Industry.”

The work was carried out on behalf of the US Council on Competitiveness, a non-partisan, non-governmental organization composed of corporate CEOs, university presidents, labour leaders and national laboratory directors. Its aim is to ‘set an action agenda to drive US competitiveness, while generating innovative public policy solutions for a more prosperous America’.

The new analysis comes just two months after a Task Force on High Performance Computing reported to the US Department of Energy that data-centric computing, in conjunction with exascale, will be one of the most important requirements of HPC within the next 10 years.

Intersect360’s report states: “High performance computing (HPC) is inextricably linked to innovation, fueling breakthroughs in science, engineering, and business.” HPC is viewed as a cost-effective tool for speeding up the R&D process, and two-thirds of all US-based companies that use HPC say that ‘increasing performance of computational models is a matter of competitive survival’.

The report identifies HPC as a key tool for the US to stay competitive in many industries, not just on an academic and government research level. It is based on the research conducted by Intersect360, consisting of 14 in-depth interviews with representatives of industrial HPC organizations and 101 responses to a comprehensive online survey of US-based companies that use HPC.

Among the key findings was the statement: “US industry representatives are confident that their organizations could consume up to 1,000-fold increases in computing capability and capacity in a relatively short amount of time.”

Increasing the computational power of today’s petaflop systems by several orders of magnitude may be the easiest route to added value for industry, but several technology roadblocks prevent current systems from being scaled in the way HPC users work today.

Software scalability, or the ability to parallelize code across potentially tens of thousands of cores, is ‘the most significant limiting factor in achieving the next 10x improvements in performance, and it remains one of the most significant factors in reaching 1,000x’.

Scaling a code in this way is a complex task, which can take several months on the petaflop systems of today. Earlier this year, ‘Alya’, a simulation code developed at the Barcelona Supercomputing Center (BSC), was reported to have been scaled to over 100,000 cores. The porting and optimization of Alya took several months, but the researchers were able to achieve ‘more than 85 per cent parallel efficiency’ running the code on Blue Waters, a system based on the Cray XE6.
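Parallel efficiency, as quoted for Alya, is the speedup actually achieved divided by the ideal (linear) speedup for the added cores. The sketch below illustrates the arithmetic; the 85 per cent figure comes from the report, but the run times, baseline core count, and the Amdahl's-law framing are hypothetical numbers chosen for illustration, not measurements from the Alya work.

```python
def parallel_efficiency(t_base, t_scaled, cores_base, cores_scaled):
    """Achieved speedup divided by the ideal speedup for the extra cores."""
    speedup = t_base / t_scaled
    ideal = cores_scaled / cores_base
    return speedup / ideal

def amdahl_speedup(serial_fraction, cores):
    """Amdahl's law: the speedup ceiling for a code with a serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Hypothetical run: 1000 s on 1,000 cores falling to 11.5 s on 100,000 cores
# corresponds to roughly 87% parallel efficiency.
eff = parallel_efficiency(1000.0, 11.5, 1_000, 100_000)
print(f"parallel efficiency: {eff:.0%}")

# Even a tiny serial fraction (here 0.001%) roughly halves the ideal
# 100,000x speedup, which is why reaching 1,000x gains is so hard.
print(f"Amdahl limit at 100,000 cores: {amdahl_speedup(1e-5, 100_000):.0f}x")
```

The second function shows why the report singles out scalability: any residual serial work in a code puts a hard ceiling on speedup, no matter how many cores are added.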

Whether it is for increasing the complexity of models for simulation-based industries or increasing the throughput of data, for the life sciences or a finance-centered business, software must be scaled effectively to make use of the cores as efficiently as possible.

But significant investment must be found if these technology barriers are to be overcome. Traditionally the largest investment in HPC has come from government research grants, for example from the US national labs. The report goes on to recommend that the ties between industry and the US government be strengthened to expedite the transition to exascale computing.

The report states that the in-depth qualitative answers indicated: ‘There is more work that can and should be done in order to provide more direct benefit to industry from government investment in scalability and expertise.’

Furthermore, 56 per cent of respondents to the Intersect360 survey agreed that the work done by national government research organizations can ‘act as a major driver for advancing HPC technology, leading to products and software that we will use in the future’.

The report adds to the growing chorus of voices promoting the benefits of HPC, and eventually exascale computing, as necessary to many different industries and calling for government investment to facilitate the process. Announcements of the next-generation CORAL systems are expected later this year, possibly around the SC14 supercomputing conference in New Orleans in November. What further policy responses will be forthcoming remains to be seen.

Tom Wilkie reviewed government-driven development of exascale computing after the close of ISC, held in Leipzig at the end of June this year.

This story appears here as part of a cross-publishing agreement with Scientific Computing World.
