In this special guest feature from Scientific Computing World, Cray’s Barry Bolding offers his predictions for the supercomputing industry in 2017. “2016 saw the introduction or announcement of a number of new and innovative processor technologies from leaders in the field such as Intel, Nvidia, ARM, AMD, and even from China. In 2017 we will continue to see capabilities evolve, but as the demand for performance improvements continues unabated and CMOS struggles to deliver them, we’ll see processors becoming more and more power hungry.”
Today the DOE Exascale Computing Project (ECP) announced changes to its strategic plan. ECP now plans to deploy the first exascale system in the U.S. in 2021, a full one to two years earlier than previously planned. The system will be built on a “novel architecture” that will be put out for bid in the near future. According to Paul Messina of Argonne, Director of the Exascale Computing Project, “It won’t be something out there like quantum computing, but we are looking for new ideas in terms of processing and networking technologies for the machine.”
Today the Department of Energy’s Exascale Computing Project (ECP) announced the selection of 35 software development proposals representing 25 research and academic organizations. “After a lengthy review, we are pleased to announce that we have selected 35 proposals for funding. The funding of these software development projects, following our recent announcement for application development awards, signals the momentum and direction of ECP as we bring together the necessary ecosystem and infrastructure to drive the nation’s exascale imperative.”
Today Argonne announced that the lab is leading a pair of newly funded application development projects for the Exascale Computing Project (ECP). The announcement comes on the heels of news that ECP has selected 15 application development proposals for full funding and seven for seed funding, representing teams from 45 research and academic organizations.
Paul Messina presented this talk at the HPC User Forum in Austin. “The Exascale Computing Project (ECP) is a collaborative effort of the Office of Science (DOE-SC) and the National Nuclear Security Administration (NNSA). As part of President Obama’s National Strategic Computing Initiative, ECP was established to develop a new class of high-performance computing systems that will be a thousand times more powerful than today’s petaflop machines.”
Today Lawrence Berkeley National Laboratory announced that LBNL scientists will lead or play key roles in developing 11 critical research applications for next-generation supercomputers as part of DOE’s Exascale Computing Project (ECP).
Today, the National Science Foundation (NSF) announced two major awards to establish Scientific Software Innovation Institutes (S2I2). The awards, totaling $35 million over five years, will support the Molecular Sciences Software Institute and the Science Gateways Community Institute, both of which will serve as long-term hubs for scientific software development, maintenance, and education. “The institutes will ultimately impact thousands of researchers, making it possible to perform investigations that would otherwise be impossible, and expanding the community of scientists able to perform research on the nation’s cyberinfrastructure,” said Rajiv Ramnath, program director in the Division of Advanced Cyberinfrastructure at NSF.
Disruptive Opportunities and a Path to Exascale: A Conversation with HPC Visionary Alan Gara of Intel
“We want to encourage and support that collaborative behavior in whatever way we can, because there are a multitude of problems in government agencies and commercial entities that seem to have high performance computing solutions. Think of bringing together the tremendous computational expertise you find from the DOE labs with the problems that someone like the National Institutes of Health is trying to solve. You couple those two together and you really can create something amazing that will affect all our lives. We want to broaden their exposure to the possibilities of HPC and help that along. It’s important, and it will allow all of us in HPC to more broadly impact the world with the large systems as well as the more moderate-scale systems.”
A newly released report commissioned by the National Science Foundation (NSF) and conducted by the National Academies of Sciences, Engineering, and Medicine examines priorities and associated trade-offs for advanced computing investments and strategy. “We are very pleased with the National Academies’ report and are enthusiastic about its helpful observations and recommendations,” said Irene Qualters, NSF Advanced Cyberinfrastructure Division Director. “The report has had a wide range of thoughtful community input and review from leaders in our field. Its timing and content give substance and urgency to NSF’s role and plans in the National Strategic Computing Initiative.”
In this video from the HPC User Forum in Tucson, Saul Gonzalez Martirena from NSF provides an update on the NSCI initiative. “As a coordinated research, development, and deployment strategy, NSCI will draw on the strengths of departments and agencies to move the Federal government into a position that sharpens, develops, and streamlines a wide range of new 21st century applications. It is designed to advance core technologies to solve difficult computational problems and foster increased use of the new capabilities in the public and private sectors.”