Patterson says erasing Tether's DARPA legacy vital to Nation's future IT leadership


UC Berkeley’s David Patterson argues at the Harvard Business Review today that Tony Tether and Bush 43 derailed the successful DARPA funding approach that brought us the Internet (among other things), and that fixing it is crucial for our Nation’s competitiveness in IT.

Specifically, DARPA under Bush drastically reduced the role of universities in IT research projects it funded and shifted both power and money to companies. If the old DARPA model is not restored, the U.S. lead in IT — especially in software — could be lost.

…Tether instituted 12- to 18-month milestones for DARPA-funded programs. If you didn’t make them, he would cancel not just one person’s contract but the whole program. The idea that you can decide the success of research in 12 to 18 months is absurd.

Patterson’s case in point is the notable lack of progress we have made on the software and technology needed to effectively take advantage of now-ubiquitous parallel hardware.

The result: Not much progress has been made in solving some of the biggest IT problems confronting us. One worth singling out in particular is developing technology so software can run on multi-core, or parallel, processors. Figuring out how you can make important programs go faster and how to add new features to them when you’re using 10 processors instead of one is a very hard problem to solve — the kind that if somebody in another country figures out how to solve it, the software center of the universe could move from the United States to someplace else.

Before Tether came in, a few of us successfully pitched a project to tackle that challenge. But during the Tether years, the vast majority of DARPA’s money for the project went to IBM, Sun, and Cray Research. I don’t know how many tens or hundreds of millions of dollars DARPA gave to these companies, but whatever research they did has had very little impact on solving one of the biggest problems facing computer science.
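To see why “making important programs go faster on 10 processors instead of one” is so hard, it helps to recall Amdahl’s law: if only a fraction of a program can be parallelized, the serial remainder caps the overall speedup no matter how many cores you add. The sketch below is illustrative only; the fractions are hypothetical and not drawn from Patterson’s article.

```python
# Amdahl's law: the speedup from running the parallelizable fraction p
# of a program on n processors (the serial fraction 1-p is unchanged).
# Illustrative numbers only, not taken from the article.

def amdahl_speedup(p, n):
    """Overall speedup when fraction p of the work is spread over n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# Even if 90% of a program parallelizes perfectly, 10 processors
# deliver roughly a 5.3x speedup, not 10x.
print(round(amdahl_speedup(0.90, 10), 2))
```

The point of the exercise: getting near-linear speedup requires parallelizing nearly everything, which is exactly the software problem Patterson says remains unsolved.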

I think that Patterson is right; but the language of crisis he uses — “U.S. lead in IT — especially in software — could be lost” — is becoming worn. I’ve recently had occasion to read most of the major blue ribbon HPC and IT reports that have been written in the US since the early 1980s, and almost every single one of them uses this language. Yes, IT is central to our Nation’s competitiveness, but have we really been on the razor’s edge of calamity for 30 years? The fact that all of these reports use the same language, and that we are still dragging it out today, starts to look like laziness on the part of our community. Rather than making a rich, reasoned argument, we just declare a crisis and cash a check.

On the other hand, perhaps this language is the only way to get the attention of our leaders. That in itself is a problem.


  1. […] and “staggering consequences” are not helpful in this context. As I’ve written before, it is a useful exercise to read the major blue ribbon HPC and IT reports that have been written […]


  1. John, I fundamentally agree with your assessment. We’ve been stuck on “IT” as the pinnacle of all technological barriers for the past thirty years. “Information Technology” has come to envelop such a myriad of industries and technologies that it has lost much of its meaning. Are we speaking about computing platforms, software paradigms, programming languages [which is fundamentally different from a software paradigm], networking [long haul and short distance] or storage? Wait, all of the above!? No single government agency is willing to fund projects in “IT.”

    David’s reference to the DARPA HPCS program is very strategic in nature. DARPA is currently working on the follow-on for the HPCS program, deemed the Ubiquitous High Performance Computing program, or UHPC. Indeed, Cray, Sun and IBM were big players in its predecessor. However, each prime contracting vendor had a myriad of national labs and university partners for the program. With the exception of IBM Deep Computing, *most* vendors employ engineers skilled in the craft of taking great ideas and turning them into supportable products. They don’t generally employ a large workforce of engineers doing pure R&D. Enter academia. In order for these DARPA programs to be wholly successful, the vendor primes *need* the wild thinking often found in universities and national labs.

  2. I have heard many companies describe the “risk outweighs the reward” attitude you have covered in this article – and they have chosen to avoid going down what should be a spirited, innovative path of discovery. This article rings so true – as what a number of us see as short-sighted, uninformed thinking with a lack of real understanding of what it takes to advance such critical computing technology.

    Thanks for raising this topic. We can only hope things will change for the better.

  3. Generally, I agree with your assessment, but this just does not fly.

    The article in HBR is so remarkably biased and opinionated that it borders on inflammatory. No numbers. No data. It uses the tired common approach of marrying unpopular political figures to the problem to rally people behind it. Academics have not lost DARPA funding to companies.

    Now it is well understood that DARPA has emphasized near-term “research” over more exotic, distant goals. However, we are also at a time where tremendous gains in our understanding of new science and our ability to engineer solutions have made rapid breakthroughs possible.

    Read over the current set of BAAs and see for yourself what the portfolio includes.

    The home runs DARPA hit were not understood as home runs in the first few years, only in hindsight. Yet all of the criticism of Tether seems to neglect that fact.

    What surprises me most are the claims that DARPA dropped the ball on parallel computing, especially in light of multicore processors. Really? All of the readers should take time to review what DARPA was funding five years ago and then try to make the same claim.

    Just look at how fast industry is moving to address this parallel compute problem. Why should DARPA even bother paying for this now? The need is well understood and actively being pursued today. Now I am all for spending more to bring post-exascale work closer to today. That is the scale of project DARPA should fund — not projects that address today’s difficulties in parallel programming.

    This article was written to gain support for increased funding.

    I am disappointed.

  4. Well, that’s certainly one opinion – but not one that I believe is shared by a great many people in the community.