The Blue Waters project at the University of Illinois is offering a new graduate course entitled Introduction to High Performance Computing. The course will be offered as a collaborative, online course for multiple participating institutions in the fall 2016 semester. “The project is seeking university partners that are interested in offering the course for credit to their students. The course includes online video lectures, quizzes, and homework assignments with access to free accounts on the Blue Waters system.”
Ensuring reliability and reproducibility in computational research raises unique challenges in the supercomputing context. Specialized architectures, extensive and customized software, and complex workflows all raise barriers to transparency, while established concepts such as validation, verification, and uncertainty quantification point ways forward. The topic has attracted national attention: President Obama’s July 2015 Executive Order, “Creating a National Strategic Computing Initiative,” includes accessibility and workflow capture as objectives; an XSEDE14 workshop released a report, “Standing Together for Reproducibility in Large-Scale Computing”; on May 5, 2015, ACM Transactions on Mathematical Software began the Replicated Computational Results Initiative; and this conference is host to a new workshop, “Numerical Reproducibility at Exascale,” to name but a few examples. In this context, I will outline a research agenda to establish reproducibility and reliability as a cornerstone of scientific computing.
The NSF has awarded $300K to NCSA to examine effective practices in industrial HPC. Led by Principal Investigator Merle Giles, the project will identify, document, and analyze effective practices in establishing public-private partnerships between High Performance Computing (HPC) centers and industry. With the market analysis firm IDC, the project will conduct a worldwide in-depth survey of 70-80 example partnerships involving HPC centers of various sizes, in the US and elsewhere, that have worked with the private sector.
Application deadlines are fast approaching for the Blue Waters Graduate Program and the International Summer School on HPC Challenges in Computational Sciences.
The National Center for Supercomputing Applications (NCSA) has a Private Sector Program (PSP) that works with smaller companies to help them adopt HPC technologies, drawing on the expertise acquired over the past quarter century. By working with these organizations, NCSA can help them determine the Return on Investment (ROI) of using more computing power to solve real-world problems than is possible on smaller, less capable systems.
In this video from SC15, Rich Brueckner from insideHPC moderates a panel discussion on the NSCI initiative. “As a coordinated research, development, and deployment strategy, NSCI will draw on the strengths of departments and agencies to move the Federal government into a position that sharpens, develops, and streamlines a wide range of new 21st century applications. It is designed to advance core technologies to solve difficult computational problems and foster increased use of the new capabilities in the public and private sectors.”
NCSA is now accepting applications for the Blue Waters Graduate Program. This unique program lets graduate students from across the country immerse themselves in a year of focused high-performance computing and data-intensive research using the Blue Waters supercomputer to accelerate their research.
Does your research generate, analyze, and/or visualize data using advanced digital resources? In its recent Call for Participation, the CADENS project is looking for scientific data to visualize or existing data visualizations to weave into larger documentary narratives in a series of fulldome digital films and TV programs aimed at broad public audiences. Visualizations of your work could reach millions of people, amplifying its greater societal impacts!
Video: NCSA’s Ed Seidel Testifies on the Networking and Information Technology Research and Development Program
In the video, Ed Seidel from NCSA testifies at a House hearing on the Networking and Information Technology Research and Development (NITRD) Program. NITRD provides a framework in which many Federal agencies come together to coordinate their networking and information technology (IT) research and development (R&D) efforts.
In this video (with transcript) from the 2015 HPC User Forum in Broomfield, Bob Sorenson from IDC moderates a User Agency panel discussion on the NSCI initiative. “You all have seen that usable statement inside the NSCI, and we are all about trying to figure out how to make usable machines. That is a key critical component as far as we’re concerned. But the thing that I think we’re really seeing, we talked about the fact that single-thread performance is not increasing, and so what we’re doing is we’re simply increasing the parallelism and then the physics limitations, if you will, of how you cool and distribute power among the parts that are there. That really is leading to a paradigm shift from something that’s based on how fast you can crunch the numbers to how fast you can feed the chips with data. It’s really that paradigm shift, I think, more than anything else that’s really going to change the way that we have to do our computing.”