Rich Brueckner from insideHPC moderated this panel discussion on current trends in HPC. “President Obama’s Executive Order establishing the National Strategic Computing Initiative (NSCI) will set the stage for a new chapter in leadership computing for the United States. In this panel discussion, thought leaders from leading supercomputing vendors share their perspectives on current HPC trends and the way forward.”
In this video from the Disruptive Technologies Panel at the HPC User Forum, Peter Braam from Cambridge University presents: Processing 1 EB per Day for the SKA Radio Telescope. “The Square Kilometre Array is an international effort to investigate and develop technologies which will enable us to build an enormous radio astronomy telescope with a million square meters of collecting area.”
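To put the headline figure in perspective, a quick back-of-the-envelope conversion (assuming the round 1 EB/day from the talk title, and decimal SI units) shows what a sustained exabyte-per-day pipeline implies:

```python
# Back-of-the-envelope conversion of the SKA's headline data rate.
# Assumes the round "1 EB per day" figure from the talk title and
# decimal (SI) units: 1 EB = 10**18 bytes.
EXABYTE = 10**18          # bytes
SECONDS_PER_DAY = 86_400  # 24 * 60 * 60

rate_bytes_per_s = EXABYTE / SECONDS_PER_DAY
rate_tb_per_s = rate_bytes_per_s / 10**12

print(f"{rate_tb_per_s:.1f} TB/s")  # roughly 11.6 TB/s sustained
```

That is on the order of 11.6 TB ingested every second, around the clock, which is why the SKA's processing pipeline is treated as an exascale computing problem in its own right.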
In this video from the Disruptive Technologies Session at the 2015 HPC User Forum, Nick New from Optalysys describes the company’s optical processing technology. “Optalysys technology uses light, rather than electricity, to perform processor intensive mathematical functions (such as Fourier Transforms) in parallel at incredibly high-speeds and resolutions. It has the potential to provide multi-exascale levels of processing, powered from a standard mains supply. The mission is to deliver a solution that requires several orders of magnitude less power than traditional High Performance Computing architectures.”
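For context on why Fourier transforms are a natural target for this kind of acceleration: on a conventional electronic processor, a 2-D FFT of an N×N grid costs on the order of N² log N operations, whereas an optical correlator performs the transform effectively in a single pass of light. A minimal sketch of the electronic baseline, using NumPy (an illustration of the workload, not Optalysys code):

```python
import numpy as np

# Electronic baseline for the kind of work an optical processor targets:
# a 2-D Fourier transform, the core of correlation/convolution workloads.
n = 256
image = np.random.default_rng(0).random((n, n))

spectrum = np.fft.fft2(image)            # O(n^2 log n) work on a CPU
roundtrip = np.fft.ifft2(spectrum).real  # inverse transform recovers the input

assert np.allclose(roundtrip, image)
```

Scaling `n` up shows how quickly this becomes processor-intensive, which is the regime where an optical approach claims its advantage.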
In this video from the 2015 HPC User Forum, Irene Qualters from the National Science Foundation discusses National Strategic Computing Initiative (NSCI). Established by an Executive Order by President Obama, NSCI has a mission to ensure the United States continues leading high performance computing over the coming decades. As part of the effort, NSCI will foster the deployment of exascale supercomputers to take on the nation’s Grand Challenges.
In this video from the Neuroinformatics 2015 Conference, Thomas Lippert from Jülich presents: Why Does the Human Brain Project Need HPC and Data Analytics Infrastructures? The HBP, the Human Brain Project, is one of two European flagship projects, each foreseen to run for 10 years. The HBP aims to create an open, neuroscience-driven infrastructure for simulation and big-data-aided modeling and research, with a credible user program.
Bo Ewald from D-Wave Systems presented this Disruptive Technologies talk at the HPC User Forum. “While we are only at the beginning of this journey, quantum computing has the potential to help solve some of the most complex technical, commercial, scientific, and national defense problems that organizations face. We expect that quantum computing will lead to breakthroughs in science, engineering, modeling and simulation, financial analysis, optimization, logistics, and national defense applications.”
Today Intel announced a 10-year collaborative relationship with the Delft University of Technology and TNO, the Dutch Organization for Applied Research, to accelerate advancements in quantum computing. To achieve this goal, Intel will invest US$50 million and will provide significant engineering resources both on-site and at Intel, as well as technical support. “Quantum computing holds the promise of solving complex problems that are practically insurmountable today, including intricate simulations such as large-scale financial analysis and more effective drug development.”
“I will describe a decade-long, multi-disciplinary, multi-institutional effort spanning neuroscience, supercomputing and nanotechnology to build and demonstrate a brain-inspired computer and describe the architecture, programming model and applications. I also will describe future efforts in collaboration with DOE to build, literally, a ‘brain-in-a-box’. The work was built on simulations conducted on Lawrence Livermore National Laboratory’s Dawn and Sequoia HPC systems in collaboration with Lawrence Berkeley National Laboratory.”