New Report Charts Future Directions for NSF Advanced Computing Infrastructure

A newly released report commissioned by the National Science Foundation (NSF) and conducted by the National Academies of Sciences, Engineering, and Medicine examines priorities and associated trade-offs for advanced computing investments and strategy. “We are very pleased with the National Academy’s report and are enthusiastic about its helpful observations and recommendations,” said Irene Qualters, NSF Advanced Cyberinfrastructure Division Director. “The report has had a wide range of thoughtful community input and review from leaders in our field. Its timing and content give substance and urgency to NSF’s role and plans in the National Strategic Computing Initiative.”

NSCI Update from the HPC User Forum

In this video from the HPC User Forum in Tucson, Saul Gonzalez Martirena from NSF provides an update on the NSCI initiative. “As a coordinated research, development, and deployment strategy, NSCI will draw on the strengths of departments and agencies to move the Federal government into a position that sharpens, develops, and streamlines a wide range of new 21st century applications. It is designed to advance core technologies to solve difficult computational problems and foster increased use of the new capabilities in the public and private sectors.”

Video: Panel Discussion on Exascale Computing

In this video from the 2016 Stanford HPC Conference, Gilad Shainer from the HPC Advisory Council moderates a panel discussion on Exascale Computing. “Exascale computing will uniquely provide knowledge leading to transformative advances for our economy, security and society in general. A failure to proceed with appropriate speed risks losing competitiveness in information technology, in our industrial base writ large, and in leading-edge science.”

Video: Reproducibility in High Performance Computing

Ensuring reliability and reproducibility in computational research raises unique challenges in the supercomputing context. Specialized architectures, extensive and customized software, and complex workflows all present barriers to transparency, while established concepts such as validation, verification, and uncertainty quantification point ways forward. The topic has attracted national attention: President Obama’s July 2015 Executive Order, “Creating a National Strategic Computing Initiative,” includes accessibility and workflow capture as objectives; an XSEDE14 workshop released a report, “Standing Together for Reproducibility in Large-Scale Computing”; on May 5, 2015, ACM Transactions on Mathematical Software began the Replicated Computational Results Initiative; and this conference is host to a new workshop, “Numerical Reproducibility at Exascale,” to name but a few examples. In this context, I will outline a research agenda to establish reproducibility and reliability as a cornerstone of scientific computing.

Creating an Exascale Ecosystem Under the NSCI Banner

“We expect NSCI to run for the next two decades. It’s a bit audacious to start a 20-year project in the last 18 months of an administration, but one of the things that gives us momentum is that we are not starting from a clean sheet of paper. There are many government agencies already involved, and what we’re really doing is increasing their coordination and collaboration. Also, we will be working very hard over the next 18 months to build momentum and establish new working relationships with academia and industry.”

Radio Free HPC Looks at the Top HPC Tech Stories from 2015

In this podcast, the Radio Free HPC team looks at the Top Technology Stories for High Performance Computing in 2015. “From 3D XPoint memory to Co-Design Architecture and NVM Express, these new approaches are poised to have a significant impact on supercomputing in the near future.” We also take a look at the most-shared stories from 2015.

Video: Dell Panel Discussion on the NSCI Initiative from SC15

In this video from SC15, Rich Brueckner from insideHPC moderates a panel discussion on the NSCI initiative. “As a coordinated research, development, and deployment strategy, NSCI will draw on the strengths of departments and agencies to move the Federal government into a position that sharpens, develops, and streamlines a wide range of new 21st century applications. It is designed to advance core technologies to solve difficult computational problems and foster increased use of the new capabilities in the public and private sectors.”

Podcast: Supercomputing the Deep Earth with the Gordon Bell Prize Winners

In this podcast, Jorge Salazar from TACC interviews two winners of the 2015 ACM Gordon Bell Prize, Omar Ghattas and Johann Rudi of the Institute for Computational Engineering and Sciences, UT Austin. As part of the discussion, Ghattas describes how parallelism and exascale computing will propel science forward.

Podcast: Dell Panels on NSCI and the Convergence of Big Data Coming to SC15

In this podcast, Stephen Sofhauser from Dell describes what’s coming up at the company’s exhibit at SC15 in Austin. With a 50×50 exhibit and two booth theaters, Dell will showcase how customers are using their technology to solve their toughest computational problems. “Our own Rich Brueckner from insideHPC will host a pair of panel discussions in the Dell booth #1009 on Wednesday, Nov. 18.”

Los Alamos Orders D-Wave 2X Quantum Computer

Today D-Wave Systems announced that Los Alamos National Laboratory will acquire and install the latest D-Wave quantum computer, the 1000+ qubit D-Wave 2X system. Los Alamos, a multidisciplinary research institution engaged in strategic science on behalf of national security, will lead a collaboration within the Department of Energy and with select university partners to explore the capabilities and applications of quantum annealing technology, consistent with the goals of the government-wide National Strategic Computing Initiative.