Chronicle article paints supercomputing codes as "rickety," misses the mark

The Coalition for Academic Scientific Computation just held its annual meeting Monday and Tuesday in DC, followed by a 20th anniversary celebration on Wednesday at which the Chronicle of Higher Education was evidently present sniffing around for story ideas.

They ended up filing this article on Wednesday, which opens as follows:

Supercomputers keep breaking records for processing speed, but software to operate them has not kept up with that increasingly zippy hardware. The often-rickety supercomputing computer code is becoming an obstacle to making better weather models, medical simulations, and other applications of high-performance computers, said experts at a conference here Wednesday on the future of academic supercomputing.

Let’s start with the ideas here. Reading further into the article, you find quotes from people who know what they’re talking about:

“Codes are still being used from the 1960s,” said Ed Seidel, director of the National Science Foundation’s office of cyberinfrastructure, in an interview at the meeting. “Those have to be retooled or rethought” to take full advantage of the latest supercomputers, he said.

This is correct. Just as a significant software investment was required when we shifted from vector to commodity processors in the 90s, we are again at a point where some retooling will be required to deal effectively with multicore processors. On this day in history we aren’t experiencing a crisis with the current crop of 4- and 6-core processors, but it’s easy to see us being in full crisis mode within the next 5 years. We clearly need to have already started this process as a community, and we haven’t. It would also be swell if we could do a little future-proofing by developing effective middleware that masks some of this complexity without losing (much) performance.
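
To make the retooling concrete, here is a small illustrative sketch in C. The loop, arrays, and sizes are made up rather than taken from any real application, and a production code needs far more than a directive, but it shows the basic shape of exposing a legacy serial loop to a multicore chip.

```c
/*
 * Minimal, illustrative sketch of the kind of retooling the multicore shift
 * demands: a legacy serial update loop gains thread-level parallelism via an
 * OpenMP directive. The arrays, sizes, and computation are hypothetical.
 * Compile with OpenMP enabled, e.g. gcc -O2 -fopenmp.
 */
#include <stdio.h>

#define N 1000000

static double a[N], b[N];

int main(void) {
    /* Fill the input array with arbitrary data. */
    for (int i = 0; i < N; i++)
        b[i] = (double)i;

    /* The legacy code ran this loop serially on one fast processor core.
     * The pragma splits the iterations across all available cores; real
     * applications also need attention to data layout, NUMA placement,
     * and load balance, but this is the basic shape of the change. */
    #pragma omp parallel for
    for (int i = 1; i < N - 1; i++)
        a[i] = 0.5 * (b[i - 1] + b[i + 1]);

    printf("a[N/2] = %f\n", a[N / 2]);
    return 0;
}
```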

And I also understand that the way the NSF gets political backing, and potentially gets additional funding, to support this transformation is by motivating members of Congress to move money in their direction. Crises motivate action, and so perhaps some good comes out of the Chronicle’s article.

But if it does, it does so dishonestly. The tone of the Chronicle’s article is not only wrong, it’s unhelpful in the grand scheme of things.

First, the article paints the entire HPC community code base with the same broad brush. There are domains in which the applications are relatively modern, helped along by innovative software teams and forward-thinking program managers. And in all cases the codes being used today are still useful and, in some instances, represent the only way to do any computation at all in a given domain; in many cases the alternative is simply nothing.

In this vein the article calls out the use of Fortran as an example of just how dire the situation in modern scientific computing really is:

Attendees at the meeting said one of the most popular computer languages used to create programs for supercomputers is Fortran, which went out of style among conventional programmers decades ago and is rarely even taught in college computer-science departments today. It’s as if your new laptop still ran MS-DOS, the operating system that predated Windows on personal computers.

I’m not sure what would have been a more stylish choice — Java, perhaps? Or C#? Nor am I sure what the point is of making a statement like that.

Our community’s use of Fortran is not a matter of style, and it’s also not (usually) just a matter of funding. I have been part of several new large-scale software development projects in recent years that chose Fortran precisely because the language offers features that allow it to perform very well on HPC architectures. In the places where we’ve used Fortran, it’s been because it was the best choice for the job. There have also been projects where it wasn’t a good choice, and we went with C. A choice of language should be based on its merits relative to the problem being solved, not on how old it is or whether it is what all the cool kids are using today.
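
To make the point about language features concrete, here is a small sketch in C (a language the article would presumably find more stylish). Fortran’s argument rules let a compiler assume that distinct array arguments don’t overlap, which frees it to vectorize loops aggressively by default; in C the programmer has to make that promise explicitly. The kernel below is a generic example of my own, not code from any of the projects mentioned above.

```c
/*
 * Illustrative only: Fortran guarantees that distinct array arguments to a
 * procedure do not overlap when one is modified, so the compiler can
 * vectorize loops like this one by default. In C, the same guarantee must
 * be stated explicitly with 'restrict'. The kernel is a generic AXPY-style
 * update, hypothetical rather than drawn from any real project.
 */
void axpy(int n, double alpha,
          const double *restrict x, double *restrict y) {
    /* Because x and y are declared restrict, the compiler may assume they
     * do not alias and can apply the same aggressive optimization a
     * Fortran compiler applies as a matter of course. */
    for (int i = 0; i < n; i++)
        y[i] = alpha * x[i] + y[i];
}
```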

Another real problem with the article is that it calls into question the reliability and usefulness of the results our community provides to the broader research enterprise. Phrases like “homemade software…written decades ago” and “often-rickety supercomputing computer code” don’t inspire confidence and, even worse, they undermine the value of supercomputing to the research establishment. Why should the nation continue to invest in any hardware at all when the best we can do is to run “rickety” software on it? Indeed, perhaps the community itself isn’t to be trusted, since the best it can do is to produce “homemade software”?

Not that I’m at all clear on what the alternative to homemade code is. Research codes are — wait for it — research. And research projects are all “homemade.” Like the first iPhone prototype, the first hybrid car, the first high-efficiency solar cell, and the first telephone.

I don’t fault the NSF or Seidel for any of this. Indeed, they seem to be taking some action in the right direction that I would be proud to be a part of:

Mr. Seidel said the National Science Foundation recently set up a committee to look into the supercomputing-software issue and make recommendations for how to fix it. Those recommendations are expected within the next year, he said.

The article itself, however, misses the mark. In the Chronicle’s defense, I acknowledge that it isn’t part of their mission to be a booster for HPC. But it is part of mine.

Comments

  1. I encourage anyone in HPC who reads this article to pass it along to your colleagues. The article in the Chronicle has drawn conclusions and used much of this information out of context – as all of us in HPC understand – and their reporting staff clearly didn’t take a deep enough look at what they were writing about.