We lost the hardware…what if we lost the software too?


At last week’s Newport HPCC conference I was on a panel that was asked to opine about what disruptive technologies we saw on the horizon. Andy Jones at NAG was in the audience, and has written briefly about the panel.

We were asked specifically about GPUs and discussed all the usual angles. Andy sums up my contribution to the panel discussion around software pretty well:

One suggestion, made by John West…, was that the next disruptive technology could be in software, especially programming tools and interfaces. This builds on the fact that parallel computing is no longer a specialist activity unique to the HPC crowd – parallel processors are becoming pervasive across all areas of computing from embedded to personal to workgroup technical computing. Parallel programming is thus heading towards a mass market activity – and the mass market is unlikely to view what we have in HPC currently (Fortran plus MPI and/or OpenMP, or limited tools, etc) with much favour. I’m not knocking any of these, but they are not mass-market interfaces to parallel computing. So perhaps the mass market, through volume of people in need – and companies driven by economics will come up with a “better” solution for interfacing with supercomputers.

As a HPC community we lost control of much of our hardware to the commodity market some years ago. Maybe we now face losing control of our software to the commodity community too.

(Emphasis Andy’s, not mine.) We were asked to talk about inflection points, but what I pointed out was actually a reflection point.

Although we lost control over our processors a long time ago, until recently we have been relatively unchallenged as owners of the requirements driving parallelism, in both hardware and software. But as parallel hardware continues to make inroads into people's pocket devices and onto their desktops, new software will provide new capabilities that users may start to clamor for. That demand would in turn revitalize the hardware market, drawing more developers into parallel code development and growing that population from the O(10,000) historically focused on HPC to O(1,000,000) focused on everything but HPC.

One great thing will come out of this: the HPC community will have access to a far more diverse set of tools. But along with it will come a new reality in which HPC, with its requirements and desires, will be a several-orders-of-magnitude minority in the parallel space. Whether we can turn that to our advantage remains to be seen.


  1. John Leidel says

    There was recently a LinkedIn thread that elicited comments from various group members on the disruptive nature of HPC software [read: programming] technologies. The debate included quite the smorgasbord of opinions from the full spectrum of the HPC ecosystem. However, the general consensus seemed to be that we need to be cognizant of the efficiency [as opposed to focusing solely on the functionality] of any new programming model that is HPC-centric.