Sid Karin sounds off on the NSF supercomputing centers


In early December I pointed to an article that Larry Smarr wrote summarizing his take on the good and the bad in the NSF supercomputing program. Now another legendary NSF HPC center director, Sid Karin, has stepped up with his thoughts.

There is no such thing as an NSF (Supercomputer) Center and there never has been. There should be. What there are, in the words of Ed Hayes, then comptroller of NSF, are “NSF ASSISTED Supercomputer Centers.”

This is a double-edged sword. The directors of the NSF centers have historically had considerably more latitude and agility in their decision making and in the operations of their organizations than the directors of their peer organizations sponsored by other federal agencies have had. …

The other side of the coin is that NSF has neither provided sufficient funding nor has it provided any other kind of support when centers found themselves in one sort of difficulty or another. In my direct experience, and to my direct knowledge of activities at other centers, NSF funding has been inadequate to provide the direct support of what used to be called the base program. Each center has raised funds from industry partners, state governments, local universities, and foundations.

I was associated with these centers as a lowly student in their heyday, and reading the honest remarks of people who, at the time, were worlds removed from me in influence and renown is fascinating. It is also instructive, and reminds us that there is much work to do if we are to become more effective with our national technology investments. Over the past decade, supercomputing in the US has lost its center of gravity. Despite repeated calls from the various PITAC/PCAST and other blue-ribbon panels, the largely federally funded supercomputing efforts in the United States are nothing more than fellow travelers, as related to one another as the guy in 7C was to you on that flight to Des Moines last month.

I’m not a fan of big government, and I don’t think a National Supercomputing Agency is the answer. But it does seem wise to bring together all of the major investors in HPC in the government and get them to articulate a single, comprehensive, strategic plan. Perhaps different agencies and departments would implement different parts of that plan. Certainly the plan would only provide rough guidelines, with much room for innovation and rapid response within the framework. But some level of organization would be beneficial insofar as we need to be able to demonstrate we are making effective use of the taxpayers’ money. Or, hell, at least that we are thinking about how we are spending that money.

Comments

  1. We probably shouldn’t be thinking of HPC centers in isolation. We need to think of the whole infrastructure, including people, software, etc. NSF seems to be heading in the right direction with the ACCI task forces (https://nsf.sharepointspace.com/acci_public/default.aspx) and CF21 (http://www.nsf.gov/pubs/2010/nsf10015/nsf10015.pdf). It would be nice to see DOE’s science labs sign up for CF21, too.

  2. Dan – thanks for the links; I’ll check them out.