So You Want To Get To Exascale? Sure, follow me. I’ll Show You Where It’s At.


If the U.S. is serious about having exascale-level computation available by the end of this decade, its current model of disjointed, incremental funding of multiple, small research projects is just not going to be sufficient.

From conferences to workshops, I’m sure many of you have heard this same response – and some of you may even be repeating this mantra:

“Will we reach exascale by 2018?” “Sure we can do it. We’re the U.S.A. We always rise to the challenge!”

Seriously? Are U.S. Policy Makers and Funding Agencies Drinking Their Own Bathwater?

And just how are we going to get there? Unfortunately, a number of people in government see exascale as just a very large procurement. Lots of computers requiring lots of power.

But the truth is, commercial computer manufacturers have absolutely no incentive to take on this level of research. Adapting the computing technology used for commercial and consumer purposes – taking a COTS (commercial off-the-shelf) approach – is clearly not the answer. We are going to need something new and unique. And the technology roadmaps that draw a line to exascale don’t provide the answers – they are all laced with smoke and mirrors.

The U.S. HPC community waits with anticipation to see what direction it should take – a direction determined primarily by the U.S. Department of Energy. Meanwhile DARPA – a driving force in HPC research and development for the past 20 years and a key component in DOE’s ability to connect the dots – seems to be losing the backing and support it needs to fund critical research programs, as the future of funding for the UHPC (Ubiquitous High Performance Computing) program is in question.

Is an entire community following the Pied Piper into the enchanted forest – foolishly believing all those presentations that say the country will have exascale by 2018?

In this special edition of The Exascale Report, also available as an audio podcast, we’re going to address a rather sensitive topic – but it’s one that needs to be discussed. Let’s call this a much-needed Exascale Reality Check.

And while we try to keep an international perspective in The Exascale Report – this special issue is focused on the U.S. But – I assure you – the discussion is pertinent to all countries.

Here’s the situation: In just the past three months, I have seen far too many presentations stating – as though it is fact and not to be questioned – that we will reach exascale by 2018. Many people believe this is a done deal.

I must admit I’m confused by some of the people who nonchalantly repeat this wishful thinking – and actually believe it.

It compels me to ask this question – again: Are U.S. Policy Makers and Funding Agencies Drinking Their Own Bathwater?

So, I wonder, is it arrogance? Or is it just a huge misunderstanding of the challenge we face, when senior industry thought leaders and opinion shapers say, “Sure we can do it. We’re the U.S.A. We always rise to the challenge!”

I think these statements, and the artificial exascale milestone of 2018 that thousands now believe is gospel, go unchallenged because there is another problem that needs to be addressed. The major computing manufacturers and their key partners – those organizations that ultimately will have to bring the necessary technology to market to enable exascale – are simply reluctant to speak up and challenge the funding agencies – for fear of retribution – or falling out of favor.

According to a number of my sources, industry is not incentivized to take risks. The industry players are expected to show blind faith in supporting funding agency requirements, and they are definitely not expected to stand up and challenge the accuracy of roadmaps and plans created by agency-funded working groups. Private industry is forced to go along with technology investigations – even when that means going down a path it knows leads nowhere.

However the U.S. got into this mode – we still have to ask the same question: Why do we have so many intelligent industry technologists and corporate leaders seemingly putting blind faith into what really – at this point – is no more than wishful thinking? I know I’m not alone in my opinion – it appears the Pied Piper is alive and well and leading the U.S. HPC community.

This article is not meant to be a political statement. And I hope it will inspire some fresh thinking – and better understanding – from both the government and industry sides of this equation. But let’s call for a reality check now – before any more time – or money – is wasted. Before we go any further into the enchanted forest.

I fully understand the importance of positive thinking – and believing we can accomplish exascale in a certain forecasted timeframe. But I also believe in the critical importance of honest reality checks – especially when the result is something we may not want to hear.

Regarding exascale, quite frankly, I’m tired of hearing “We can do it if we put our minds to it!”

I’m sure Dr. Norman Vincent Peale would be proud to hear this sentiment being used so often, but putting our minds to it is only half the equation. Who is going to pay for this brain trust – and the many interim research programs and development platforms needed for the incremental Proofs of Concept that will be required?

If “putting our minds to it” also refers to finding the funds – well, great. But when are we going to do that? Because now is the time.

Achieving exascale is a long-term investment. And I stand very firmly on the side of the table that believes it is one of the most important investments any country can make, let alone the U.S. But speaking directly to the U.S. policy makers and HPC community, I have to be blunt and say I just don’t see the evidence that the U.S. funding agencies understand, or care about, the importance of a long-term investment in exascale – and I certainly don’t see the funding commitment necessary to get us there. Even the future of the UHPC program – a critical source of much-needed research – is questionable. If continuation of the UHPC program funding is in jeopardy, what does that say about our commitment to HPC? And to exascale? And where, then, do we find the funding that supports ongoing HPC architectural research and innovation – at the level we need to build exascale-class systems?

There is no doubt that a tremendous amount of research is needed, primarily in three areas:

  • Power Reduction
  • Resiliency
  • Software Development

Now it seems like this type of research should be the mission of the UHPC program. And it is. But the funding is for short-term research and discovery. That’s part of the problem. This isn’t a one-time, short-term investment. Research leads to more research – and refinement – and these three areas will prove to be the foundation of actually building practical exascale systems. DARPA seems to be losing support – from conversations I’ve had with key people in the community – and ongoing funding for the UHPC program appears to be in jeopardy. This would be a huge setback to any exascale development efforts – and in my opinion – would pretty much kill any chance the U.S. has of achieving exascale-level computation by the end of the decade. The UHPC program may not be perfect, but it’s so very important. It’s the best we have – and it needs and deserves much more support.

Along with many of my colleagues, I’m discouraged by senior government officials who think they don’t have to do anything – other than procure a very large system. This is naïve. As stated previously, there is absolutely no incentive for commercial computing companies to take on this level of research – or to develop the technology or systems at this scale. The computer industry is not going to solve this problem without significant government funding to drive the research agenda. There has never been a stronger need for collaboration – and involvement.

But maybe there’s a bigger issue here. Maybe we just can’t overcome the politics involved to create a unified plan. How do we get consensus on an investment that will take 15-20 years to show an ROI? Quite possibly the most impressive ROI of any technology ever funded – but still – we’re talking about research and development that will span 4-5 political elections. Can this country take on such a challenge?
We need to think big. We are all happy to “talk the talk” – but now it’s time to “walk the walk.”

The U.S. continues to struggle with economic challenges, so it’s no wonder we have all become such short-term thinkers. Businesses live from quarter to quarter, and many Americans live week to week. There is no overnight fix, but if we can achieve exascale and apply that level of computation to a broad range of challenges, we have a tremendous opportunity to address some of the most pressing economic problems this nation has ever faced. Exascale-level computation – with modeling and simulation capabilities one thousand times faster than anything we have today – would be instrumental in driving energy research and potentially ending our dependence on fossil fuels. Exascale systems could give us impressive breakthroughs in our understanding of DNA, to the point of new drug discovery and new healthcare procedures – and ultimately have an impact on the cost of healthcare in this country.
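To put the “one thousand times” figure in concrete terms – this is simple arithmetic, not a claim about any specific machine:

\[ 1\ \text{exaFLOPS} \;=\; 10^{18}\ \text{FLOP/s} \;=\; 1000 \times 10^{15}\ \text{FLOP/s} \;=\; 1000\ \text{petaFLOPS}. \]

A petaFLOPS-class system is the fastest class of machine in production today, so “exascale” means roughly a thousandfold jump in sustained computational capability.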

But getting there requires a new attitude – a fresh way of thinking. There is no evidence – in any product roadmaps from the computer manufacturers – that we can actually achieve exascale by the end of this decade.

We need a great big reality check. An exascale reality check.

We are in great danger of missing the mark – and we are wasting precious time by not putting forth a consolidated, long-term, and solidly funded exascale initiative. The risk is real. Technology leadership translates to economic gains. Loss of technology leadership translates to economic loss.

What do you say when someone says, “Absolutely. We’ll have an exascale system by 2018”?

I say – Possibly, but not likely. Perhaps a very, very large physical installation capable of hitting some benchmark – maybe – but even that is doubtful. A stunt machine at best.

In a discussion several months ago, Pete Ungaro, CEO of Cray, summed up the challenge quite well with a matter-of-fact statement: “There are really only two challenges that will keep us from getting to exascale: funding, and finding an affordable power solution.”

While I personally see those two items as being intertwined, I agree with Pete. And to a certain degree, I do believe we can do just about anything. It just takes money. And in this case – a lot of money.
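To illustrate just how intertwined those two items are, here is a rough back-of-the-envelope sketch. The specific figures – a roughly 2-petaFLOPS system drawing on the order of 7 MW, and an often-cited exascale planning target of roughly 20 MW – are my own illustrative assumptions, not numbers quoted by Pete or anyone else in this article:

\[ \frac{10^{18}\ \text{FLOP/s}}{2 \times 10^{15}\ \text{FLOP/s}} \times 7\ \text{MW} \;\approx\; 3.5\ \text{GW} \quad \text{(naive linear scaling of today’s technology)} \]

\[ \frac{20 \times 10^{6}\ \text{W}}{10^{18}\ \text{FLOP/s}} \;=\; 2 \times 10^{-11}\ \text{J/FLOP} \;=\; 20\ \text{pJ per operation} \quad \text{(the energy budget implied by a 20 MW target)} \]

In other words, simply scaling up what we have would demand gigawatts, while an affordable machine demands well over a hundredfold improvement in energy per operation. That gap is precisely why the power problem will not be solved without sustained, well-funded research.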

Let me be clear. This isn’t about a “race” to exascale. And it’s not about exascale for the sake of achieving another milestone. This really is about making the investment today to turn this country’s economy around in a realistic timeframe. Exascale represents the potential to change the entire financial model of healthcare. It offers the best approach to eliminating the nation’s dependence on oil – and finding and understanding practical alternative energy solutions such as biofuels or fusion energy. Exascale-level computation could provide weather prediction models that would be a thousand times faster than what we have today. The potential financial benefit to this nation – to millions of Americans – is so staggering that I have a difficult time understanding why we don’t have a coordinated national exascale initiative – with all the funding stops being pulled out.

That’s a Pied Piper I’d be happy to follow.

Honestly, I’m not trying to make a political statement. I’m just saying that as a nation, we need to get our heads out of the sand and step up now.

Computational capabilities a thousand times faster than anything we have today change everything. Everyone wins.

In a recent poll of several dozen industry thought leaders, I received 100% agreement on the following point: the U.S. is not demonstrating a strong enough commitment to long-term, sustained exascale funding.

“Reaching exascale with the funding commitments we have today is like trying to build an aircraft carrier with popsicle sticks. And first we have to eat the popsicles.” – Anonymous

The future of exascale for the U.S. at this time has fallen onto the doorstep of DOE. But maybe this challenge – the scale and scope of this effort – is too large for DOE. Maybe it’s too large for any one agency.

I’ve recently taken the time to carefully read and absorb the “Exascale Workshop Panel Meeting Report” from the January 2010 workshop sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research.

This report, also referred to by insiders as the Trivelpiece Report (after Alvin W. Trivelpiece, who served as Director of the Office of Energy Research, now the Office of Science, from 1981 to 1987, and as Director of ORNL from 1989 to 2000), was prepared by a prestigious Exascale Review Panel. It attempts to summarize the collective findings of several groups of scientists and engineers, directly or indirectly related to U.S. Department of Energy programs, who participated in various town meetings and workshops held during the preceding several years.

Some of the discussion in this report is enlightening, but it is also written with a certain conservative tone – purposely, I’m sure – which, in my opinion, detracts from the sense of urgency that’s required.

But this report does emphasize something I’m trying to make crystal clear. I quote the following from the “Exascale Workshop Panel Meeting Report”:

“In the past, market forces alone have not been sufficient to attract private investments necessary to bring about major advances in supercomputer development and significant investment has been necessary.”

As stated, there is no incentive for the computer manufacturers in private industry, challenged with meeting shareholder expectations on a quarterly basis, to launch the long-term research efforts necessary to get us to exascale.

Without a government-led initiative, the industry will turn its attention to shorter-term technology investments for market growth. This was true in the past, and it is even more so today.

In some cases, it appears we are hoping to find the breakthroughs we will need for exascale through a “bottom-up” approach: watch the developments in cloud computing; watch the developments in batteries for cell phones and laptops; maybe we’ll see something that will help us get to exascale.

This is so wrong.

We need to be taking a top-down approach – enabling the research into the technology we will need for exascale and letting it create a trickle-down, or flow-down, effect that will benefit these nearer-term consumer markets.

If you haven’t seen the “Exascale Workshop Panel Meeting Report”, it is worth reading. Take a look at page 11, the third paragraph. I applaud the Workshop Panel for the clarity of this text – and I can only hope that everyone on Capitol Hill will take it to heart. (The underlines are mine.)

“The advance to exascale computing will require overcoming significant technical challenges (as summarized in “Exascale Computer Challenges”). Only the federal government can mobilize the technical talent and make the required investments to achieve an effective exascale computing capability in a timely manner. Moreover, only the federal government’s missions can mobilize the nation’s capabilities into a program where a broad range of scientific applications would drive development of computer software, architectures, and instruments to produce both the necessary computer hardware and a problem-solving environment that is optimized for computational science and engineering. It is important to recognize – the benefits to the federal government and the nation – would be significant in achieving this goal of optimized hardware and software.”

What we obviously need to do is put aside the politics and competitive nature of the various agencies, and bring together the resources of DOE, DARPA, NSF and the intelligence community – to drive a coordinated, massive, research and development effort to make exascale-level computation a reality – in a reasonable timeframe.

The report includes a discussion of the suggested benefits to the U.S. national economy of an exascale initiative, but perhaps more important are the suggested potential consequences of not moving forward with one – the impact on international competitiveness and national security.

One can read this report as a U.S. perspective – which of course it is – however, the exact same sentiment is being felt and articulated in other countries, perhaps with more of a sense of urgency than we are seeing from the American government.

I feel it is also important to point out that while this report was primarily intended to focus on one agency – the U.S. Department of Energy – and the benefits as they relate to DOE programs, there is no doubt that an organized, national exascale initiative would reap benefits well beyond the domain of DOE.

We have to start somewhere

There is much more at stake here than winning a race. The years of development required to go down the exascale path will unarguably lead to many new ideas, new research paths, and new developments. The benefits can be far-reaching to many industries. On the other hand, the consequences of failing to invest appropriately in moving exascale development efforts forward with what amounts to a national, collaborative organized initiative could have a devastating economic impact on multiple industries, possibly even putting the U.S. behind other nations in terms of industrial competitiveness.

Far too many public presentations and lengthy studies are describing the tremendous challenges we face on the road to exascale, which of course we’ve touched on here as well. I think everyone gets it by now. This is extremely high on the difficulty meter. So, what are the options?

Well, there’s of course business as usual. Let the computer vendors crank out incremental improvements as they can – and we’ll find clever ways to stretch the system capabilities with tools and techniques. OK, even the computer manufacturers will tell us that this approach is not going to get us there.

The current approach seems to be a government funding model that puts a little money here – and a little money there – and pays for disjointed teams to investigate; if any particular research looks promising, well, maybe then some agency will find a little more money to throw at it. There is reluctance to jump in with both feet, and there is disagreement as to the importance of significant funding. There is also the history, up to this point, of HPC progress coming without too much disruption. Government funding was always there to help private industry push the envelope. Significant computing breakthroughs were mostly achieved when private industry was subsidized by the U.S. Government to tackle technology and performance challenges that were somewhat beyond what those manufacturers needed to feed their bottom lines. But even with teraFLOPS and petaFLOPS, the vendors could see the potential ROI for their higher-volume market segments. Exascale is such a unique beast – the potential ROI to consumer markets and high-volume computing markets is just too far away.

Intel, IBM, HP, Cray and others are certainly looking at exascale and allocating resources to various research disciplines, but at the same time they also need to use those resources to support current and near-term product lines – and will, at some point over the next few years, have to review the ROI of those research efforts and balance those investments against keeping the companies profitable in the near term. That is not a formula for reaching the breakthroughs necessary to achieve exascale during this decade.

Without government support – long-term, fully committed support – no single company can make the long-term R&D investment necessary to give us the breakthrough technologies we will need to hit an exascale goal.

The Exascale Review Panel assembled by the DOE recommended that DOE initiate a program to help ensure exascale capability in a timely manner. Right now, most people are talking about the 2018 timeframe based on some roadmaps drawn up several years ago. Based on the recent discussions I’ve had with numerous people, I’d say we are looking at possibly 2020 for a very limited system – not a fully functional system capable of running more than one or two specifically designed and written applications. It would be another 3-5 years, at least, before we would see any practical use from these systems.

But just imagine the breakthroughs that we could experience as we start to take advantage of this unprecedented level of computation.

There is really no limit to the industries that can benefit from exascale-level computation. And, in some cases, the potential benefit should be enough to motivate government funding at ten times the level we are currently seeing. And I’m not talking about any “race” or the psychological or political importance of being first. I’m talking about the benefits to an entire nation and our ability to radically change the lives of millions.

There is an absolutely critical need to accelerate discovery – and it’s not limited to just the U.S. But, if the U.S. doesn’t get its act together – and move this type of R&D ahead at a more steady and committed pace, this nation will fall behind in ways that will undermine the country’s ability to be competitive.

Having a unified exascale plan – as a national strategic initiative would be very difficult. Some people say it would be impossible. Impossible? Wait…what happened to, “This is the U.S.A. We can do anything …if we put our minds to it.”

Pieces of an exascale program are underway. Many different efforts. Lots of meetings and reports. Many different, disconnected research projects.

So where are we really? Is the U.S. HPC community following a Pied Piper? Of course not. That’s silly.

We’re attempting to follow a dozen Pied Pipers – down multiple paths, across multiple bridges – and hoping we all end up in the same place.

We need unified leadership – a unified, coordinated program – and we shouldn’t be following the music of a Pied Piper. We should be working in harmony to make the music. We should all be playing in the same symphony.

I urge each of you to join me in asking for an exascale reality check. Join me in raising awareness for a unified, national exascale program. Let’s bring an entire community together – Let’s make great music together – our own music – and let’s do it before it’s time to Pay the Piper.

For related stories, visit The Exascale Report Archives.
