“This talk will describe one new effort to embed best practices for reproducible scientific computing into traditional university curricula. In particular, a set of open source, liberally licensed IPython (now Jupyter) notebooks is being developed and tested to accompany the book ‘Effective Computation in Physics.’ These interactive lecture materials lay out in-class exercises for a project-driven, upper-level undergraduate course and are accordingly intended to be forked, modified, and reused by professors across universities and disciplines.”
“My team at the University of Minnesota has been collaborating with the team of Falk Herwig at the University of Victoria to simulate brief events in the lives of stars that can greatly affect the heavy elements they synthesize in their interiors and subsequently expel into the interstellar medium. These events are caused by the ingestion of highly combustible hydrogen-rich fuel into the convection zone above a helium burning shell in the deeper interior. Although these events are brief, it can take millions of time steps to simulate the dynamics in sufficient detail to capture subtle aspects of the hydrogen ingestion. To address the computational challenge, we exploit modern multicore and many-core processors and also scale the simulations to run efficiently on over 13,000 nodes of NSF’s Blue Waters machine at NCSA.”
The National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign is helping change the way genetic medicine is researched and practiced in Africa. Members of the Blue Waters team recently made it possible to discover genomic variants in over 300 deeply sequenced human samples to help construct a genotyping chip specific for […]
The NCSA Blue Waters project is offering a Workflows Workshop virtual course in August. To share the class with as many students as possible, the project is seeking universities willing to serve as local sites and offer the course to their students.
“The US, like the EU and other nations, is engaged in a national initiative that aims to deploy exascale computing platforms early in the next decade. The outlines of such platforms are starting to emerge. We shall survey, in our talk, the current roadmap for exascale computing and the main challenges this roadmap entails. We shall also discuss the likely evolution of HPC beyond exascale, in the “post-Moore” era.”
IDC will hold a pair of simultaneous events on Tuesday, June 21, from 7:30 to 10:00 a.m. at ISC 2016 in Frankfurt. “IDC cordially invites you and other HPC community members to attend our annual HPC market update breakfast briefing at the ISC’16 conference in Frankfurt. This year, for the first time, we will hold two different events – the annual update on the whole HPC market, and at the request of the ISC organizers, a separate presentation focused on the HPC market for industry/commerce.”
Steve Oberlin, chief technology officer for accelerated computing at NVIDIA, will give two NCSA 30th Anniversary Featured Lectures on May 26. The morning talk is tailored for NCSA staff, Computer Science, and Electrical and Computer Engineering students and faculty. The second talk is open to the public.
The Blue Waters project at the University of Illinois is offering a new graduate course entitled Introduction to High Performance Computing. The course will be offered as a collaborative, online course for multiple participating institutions in the fall 2016 semester. “The project is seeking university partners that are interested in offering the course for credit to their students. The course includes online video lectures, quizzes, and homework assignments with access to free accounts on the Blue Waters system.”
Ensuring reliability and reproducibility in computational research raises unique challenges in the supercomputing context. Specialized architectures, extensive and customized software, and complex workflows all raise barriers to transparency, while established concepts such as validation, verification, and uncertainty quantification point ways forward. The topic has attracted national attention: President Obama’s July 2015 Executive Order, “Creating a National Strategic Computing Initiative,” includes accessibility and workflow capture as objectives; an XSEDE14 workshop released a report, “Standing Together for Reproducibility in Large-Scale Computing”; on May 5, 2015, ACM Transactions on Mathematical Software began the Replicated Computational Results Initiative; and this conference is host to a new workshop, “Numerical Reproducibility at Exascale,” to name but a few examples. In this context, I will outline a research agenda to establish reproducibility and reliability as a cornerstone of scientific computing.
The NSF has awarded $300K to NCSA to examine effective practices in industrial HPC. Led by Principal Investigator Merle Giles, the project will identify, document, and analyze effective practices in establishing public-private partnerships between high-performance computing (HPC) centers and industry. Working with the market analysis firm IDC, the project will conduct a worldwide, in-depth survey of 70-80 example partnerships between HPC centers of various sizes, in the US and elsewhere, and the private sector.