In this RCE Podcast, Brock Palen and Jeff Squyres speak with Jonathan Dursi about his recent editorial entitled "HPC is dying, and MPI is killing it." The article has attracted a lot of attention and spawned good discussion in our community.
The idea that the people at Google doing large-scale machine learning problems (which involve huge sparse matrices) are oblivious to scale and numerical performance is just delusional. The suggestion that the genomics community is a helpless lot who just don't know any better and need to be guided back to the one true path is no less so. The reality is simpler: HPC is wedded to a nearly 25-year-old technology stack which doesn't meet the needs of those communities, and, if we were being honest with ourselves, is meeting fewer and fewer of the needs of even our traditional user base.
Jonathan Dursi has worked in large-scale technical computing for nearly 20 years. He has worked at the DOE ASCI Flash Centre at the University of Chicago, where he was part of the team that won a 2000 Gordon Bell Award; at the Canadian Institute for Theoretical Astrophysics, where he collaborated with the Canadian astronomy community as co-author of a long-range plan white paper designing a decadal plan for computing in this data-intensive field; at SciNet, Canada's largest academic supercomputing centre; at Compute Canada; and most recently at the Ontario Institute for Cancer Research, in the Department of Informatics and Bio-computing. He has taught classes in HPC and technical computing techniques in three countries, to students in many disciplines.
In related news, Jeff Squyres has posted his thoughts on this topic over at the Cisco Blog.
Sign up for our insideHPC Newsletter.