Univ. of Michigan Researchers Using TACC Frontera HPC for Space Weather Forecasting
“There are only two natural disasters that could impact the entire U.S.,” according to Gabor Toth, professor of Climate and Space Sciences and Engineering at the University of Michigan. “One is a pandemic and the other is an extreme space weather event.” We’re currently seeing the effects of the first in real time. The last major space […]
Video: A Look at the National Strategic Computing Initiative
In this video, Dr. Carl J. Williams, Deputy Director of the Physical Measurement Laboratory at the National Institute of Standards and Technology within the United States Department of Commerce, reviews the National Strategic Computing Initiative. Established by Executive Order, the initiative aims to maximize the benefits of high-performance computing research, development, and deployment.
Barry Bolding from Cray Shares Four Predictions for HPC in 2017
In this special guest feature from Scientific Computing World, Cray’s Barry Bolding gives some predictions for the supercomputing industry in 2017. “2016 saw the introduction or announcement of a number of new and innovative processor technologies from leaders in the field such as Intel, Nvidia, ARM, AMD, and even from China. In 2017 we will continue to see capabilities evolve, but as the demand for performance improvements continues unabated and CMOS struggles to drive performance improvements we’ll see processors becoming more and more power hungry.”
New Plan: ECP Project to Deploy First Exascale System by 2021
Today the DOE Exascale Computing Project (ECP) announced changes to its strategic plan. ECP now plans to deploy the first exascale system in the U.S. in 2021, a full one to two years earlier than previously planned. The system will be built on a “novel architecture” that will be put out for bid in the near future. According to Paul Messina of Argonne, Director of the Exascale Computing Project, “It won’t be something out there like quantum computing, but we are looking for new ideas in terms of processing and networking technologies for the machine.”
Exascale Computing Project Awards $34 Million for Software Development
Today the Department of Energy’s Exascale Computing Project (ECP) announced the selection of 35 software development proposals representing 25 research and academic organizations. “After a lengthy review, we are pleased to announce that we have selected 35 proposals for funding. The funding of these software development projects, following our recent announcement for application development awards, signals the momentum and direction of ECP as we bring together the necessary ecosystem and infrastructure to drive the nation’s exascale imperative.”
Argonne to Develop Applications for ECP Exascale Computing Project
Today Argonne announced that the Lab is leading a pair of newly funded application projects for the Exascale Computing Project (ECP). The announcement comes on the heels of news that ECP has selected a total of 15 application development proposals for full funding and seven for seed funding, representing teams from 45 research and academic organizations.
Video: The ECP Exascale Computing Project
Paul Messina presented this talk at the HPC User Forum in Austin. “The Exascale Computing Project (ECP) is a collaborative effort of the Office of Science (DOE-SC) and the National Nuclear Security Administration (NNSA). As part of President Obama’s National Strategic Computing Initiative, ECP was established to develop a new class of high-performance computing systems that will be a thousand times more powerful than today’s petaflop machines.”
Berkeley Lab to Develop Key Applications for ECP Exascale Computing Project
Today Lawrence Berkeley National Laboratory announced that LBNL scientists will lead or play key roles in developing 11 critical research applications for next-generation supercomputers as part of DOE’s Exascale Computing Project (ECP).
NSF to Invest $35 million in Scientific Software
Today, the National Science Foundation (NSF) announced two major awards to establish Scientific Software Innovation Institutes (S2I2). The awards, totaling $35 million over five years, will support the Molecular Sciences Software Institute and the Science Gateways Community Institute, both of which will serve as long-term hubs for scientific software development, maintenance, and education. “The institutes will ultimately impact thousands of researchers, making it possible to perform investigations that would otherwise be impossible, and expanding the community of scientists able to perform research on the nation’s cyberinfrastructure,” said Rajiv Ramnath, program director in the Division of Advanced Cyberinfrastructure at NSF.
Disruptive Opportunities and a Path to Exascale: A Conversation with HPC Visionary Alan Gara of Intel
“We want to encourage and support that collaborative behavior in whatever way we can, because there are a multitude of problems in government agencies and commercial entities that seem to have high performance computing solutions. Think of bringing together the tremendous computational expertise you find from the DOE labs with the problems that someone like the National Institutes of Health is trying to solve. You couple those two together and you really can create something amazing that will affect all our lives. We want to broaden their exposure to the possibilities of HPC and help that along. It’s important, and it will allow all of us in HPC to more broadly impact the world with the large systems as well as the more moderate-scale systems.”