Data Vortex Technologies today announced that it has sold and delivered a DV205 system, “PEPSY”, to the Pacific Northwest National Laboratory (PNNL). PEPSY is specifically designed to solve problems requiring extensive processor-to-processor communication in parallel computing systems.
Jim Ganthier from Dell presented this talk at the HPC User Forum. “Dell HPC solutions are deployed across the globe as the computational foundation for industrial, academic and governmental research critical to scientific advancement and economic and global competitiveness. With the richness of the Dell enterprise portfolio, HPC customers are increasingly relying on Dell HPC experts to provide integrated, turnkey solutions and services resulting in enhanced performance, reliability and simplicity.”
As an open source tool designed to navigate large amounts of data, Hadoop continues to find new uses in HPC. Managing a Hadoop cluster is different from managing an HPC cluster, however. It requires mastering some new concepts, but the hardware is basically the same, and many Hadoop clusters now include GPUs to facilitate deep learning.
Toni Cortés from the Barcelona Supercomputing Center presented this talk at the HPC Advisory Council Spain Conference. “BSC is the National Supercomputing Facility in Spain and was officially constituted in April 2005. BSC-CNS manages MareNostrum, one of the most powerful supercomputers in Europe, located at the Torre Girona chapel. The mission of BSC-CNS is to investigate, develop and manage information technology in order to facilitate scientific progress.”
In this video plus transcripts from the 2015 HPC User Forum in Broomfield, Bob Sorensen from IDC moderates a panel discussion on the National Strategic Computing Initiative (NSCI). “Established by an Executive Order from President Obama, the National Strategic Computing Initiative has a mission to ensure the United States continues leading high performance computing over the coming decades. As part of the effort, NSCI will foster the deployment of exascale supercomputers to take on the nation’s Grand Challenges.”
In this video from the 2015 HPC User Forum, Will Koella from the Department of Defense discusses the National Strategic Computing Initiative (NSCI). Established by an Executive Order from President Obama, NSCI has a mission to ensure the United States continues leading high performance computing over the coming decades. As part of the effort, NSCI will foster the deployment of exascale supercomputers to take on the nation’s Grand Challenges.
In this video from the Neuroinformatics 2015 Conference, Thomas Lippert from Jülich presents: Why Does the Human Brain Project Need HPC and Data Analytics Infrastructures? The Human Brain Project (HBP) is one of two European flagship projects, foreseen to run for 10 years. The HBP aims to create an open, neuroscience-driven infrastructure for simulation and big-data-aided modeling and research with a credible user program.
In this video, Alexandru Iosup from the TU Delft presents: Scalable High Performance Systems. “During this masterclass, Alexandru discussed several steps towards addressing interesting new challenges which emerge in the operation of the datacenters that form the infrastructure of cloud services, and in supporting the dynamic workloads of demanding users. If we succeed, we may not only enable the advent of big science and engineering, and the almost complete automation of many large-scale processes, but also reduce the ecological footprint of datacenters and the entire ICT industry.”
The Alan Turing Institute is the UK’s national institute for data science. It has marked its first few days of operations with the announcement of its new director, the confirmation of £10 million of research funding from Lloyd’s Register Foundation, a research partnership with GCHQ, collaboration with the EPSRC and Cray, and the commencement of its first research activities.