Towards Reproducible Data Analysis Using Cloud and Container Technologies

Sergio Maffioletti from the University of Zurich gave this talk at the hpc-ch forum on Cloud and Containers. “In this talk, we’ll provide an overview of the challenges faced by both research infrastructure providers and Science IT units, along with best practices to improve the reproducibility of data analysis using cloud and container technologies.”

Call for Submissions: SC18 Workshop on Reproducibility

Over at the SC18 Blog, Stephen Lien Harrell from Purdue writes that the conference will host a workshop on the hot topic of Reproducibility. Their Call for Submissions is out with a deadline of August 19, 2018. “The Systems Professionals Workshop is a platform for discussing the unique challenges and developing the state of the practice for the HPC systems community. The program committee is soliciting submissions that address the best practices of building and operating high performance systems with an emphasis on reproducible solutions that can be implemented by systems staff at other institutions.”

John Gustafson to host BoF on Posit Arithmetic at SC17

John Gustafson from A*STAR will host a BoF on Posit arithmetic at SC17. Entitled “Improving Numerical Computation with Practical Tools and Novel Computer Arithmetic,” the BoF will be co-hosted by Mike Lam, with discussions on tools for measuring floating-point accuracy. “This approach obtains more accurate answers than floating-point arithmetic yet uses fewer bits in many cases, saving memory, bandwidth, energy, and power.”
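The accuracy issues that motivate posits and measurement tools are easy to demonstrate. The sketch below is illustrative only and is not drawn from the BoF materials: it compares naive IEEE-754 summation against Python's error-compensated `math.fsum` to expose the rounding error that such tools quantify.

```python
import math

# 0.1 has no exact binary floating-point representation, so repeated
# addition accumulates rounding error that compensated summation avoids.
values = [0.1] * 10

naive = sum(values)       # ordinary left-to-right floating-point sum
exact = math.fsum(values) # tracks exact partial sums (Shewchuk's algorithm)

print(naive)              # 0.9999999999999999 (rounding error visible)
print(exact)              # 1.0
print(abs(naive - exact)) # the measured error, roughly 1.1e-16
```

Tools like those discussed in the BoF automate this kind of comparison across whole applications, pinpointing where reduced- or mixed-precision arithmetic is safe.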

SC17 Student Cluster Competition Continues Push for Reproducibility

“The SCC reproducibility program is part of a wider effort to encourage authors submitting papers to the conference to voluntarily complete an appendix to their paper that described the details of their software environment and computational experiments to the extent that an independent person could replicate their results.”

Video: Reproducibility in High Performance Computing

Ensuring reliability and reproducibility in computational research raises unique challenges in the supercomputing context. Specialized architectures, extensive and customized software, and complex workflows all raise barriers to transparency, while established concepts such as validation, verification, and uncertainty quantification point ways forward. The topic has attracted national attention: President Obama’s July 2015 Executive Order, “Creating a National Strategic Computing Initiative,” includes accessibility and workflow capture as objectives; an XSEDE14 workshop released a report, “Standing Together for Reproducibility in Large-Scale Computing”; on May 5, 2015, ACM Transactions on Mathematical Software began the Replicated Computational Results Initiative; and this conference is host to a new workshop, “Numerical Reproducibility at Exascale,” to name but a few examples. In this context, I will outline a research agenda to establish reproducibility and reliability as a cornerstone of scientific computing.