

European Commission Steps Up Funding of HPC

In this special guest feature, Tom Wilkie from Scientific Computing World reports that the European Commission is funding research projects and centers of excellence as part of its strategy to coordinate European HPC efforts. In October, the EC made a series of announcements on how it is going to invest some of the €700 million allocated to its Public-Private Partnership on high performance computing.

PNNL Launches CENATE Computing Proving Ground

Pacific Northwest National Laboratory has opened the Center for Advanced Technology Evaluation (CENATE), a first-of-its-kind computing proving ground. Designed to shape future extreme-scale computing systems, CENATE evaluations will mostly concern processors, memory, networks, storage, input/output, and the physical aspects of certain systems, such as sizing and thermal effects.

Cray Opens EMEA Research Lab in Bristol

Today Cray announced the creation of the Cray Europe, Middle East and Africa (EMEA) Research Lab. The Cray EMEA Research Lab will foster the development of deep technical collaborations with key customers and partners, and will serve as the focal point for the Company’s technical engagements with the European HPC ecosystem.

New Paper: Can 3D-Stacking Topple the Memory Wall?

Can 3D-stacking technology topple the long-standing “memory wall” that’s been holding back HPC application performance? A new paper from the Barcelona Supercomputing Center written in collaboration with experts from Chalmers University and Lawrence Livermore National Laboratory concludes that it will take more than just the simple replacement of conventional DIMMs with 3D-stacked devices.

Radio Free HPC Looks at Highlights from Fall 2015 HPC Conferences

In this podcast, the Radio Free HPC team goes over a Trip Report from Rich Brueckner of insideHPC, who has been on the road at a series of HPC conferences. We captured more than 50 talks in the past month, and we have them all right here with the very latest in High Performance Computing.

D-Wave to Collaborate with Google, NASA, and USRA on Quantum Computing

Today D-Wave Systems announced a new agreement covering the installation of a succession of D-Wave systems located at NASA’s Ames Research Center. “The new agreement is the largest order in D-Wave’s history, and indicative of the importance of quantum computing in its evolution toward solving problems that are difficult for even the largest supercomputers,” said D-Wave CEO Vern Brownell. “We highly value the commitment that our partners have made to D-Wave and our technology, and are excited about the potential use of our systems for machine learning and complex optimization problems.”

Video: HPC Technology Panel at the PBS Works User Group

Rich Brueckner from insideHPC moderated this panel discussion on current trends in HPC. “President Obama’s Executive Order establishing the National Strategic Computing Initiative (NSCI) will set the stage for a new chapter in leadership computing for the United States. In this panel discussion, thought leaders from leading supercomputing vendors share their perspectives on current HPC trends and the way forward.”

Video: Processing 1 Exabyte per Day for the SKA Radio Telescope

In this video from the Disruptive Technologies Panel at the HPC User Forum, Peter Braam from Cambridge University presents: Processing 1 EB per Day for the SKA Radio Telescope. “The Square Kilometre Array is an international effort to investigate and develop technologies which will enable us to build an enormous radio astronomy telescope with a million square meters of collecting area.”

Optalysys: Disruptive Optical Processing Technology for HPC

In this video from the Disruptive Technologies Session at the 2015 HPC User Forum, Nick New from Optalysys describes the company’s optical processing technology. “Optalysys technology uses light, rather than electricity, to perform processor-intensive mathematical functions (such as Fourier Transforms) in parallel at incredibly high speeds and resolutions. It has the potential to provide multi-exascale levels of processing, powered from a standard mains supply. The mission is to deliver a solution that requires several orders of magnitude less power than traditional High Performance Computing architectures.”

Transcript: Irene Qualters from the NSF Discusses the NSCI Initiative

In this video from the 2015 HPC User Forum, Irene Qualters from the National Science Foundation discusses the National Strategic Computing Initiative (NSCI). Established by an Executive Order from President Obama, NSCI has a mission to ensure that the United States continues to lead in high performance computing over the coming decades. As part of the effort, NSCI will foster the deployment of exascale supercomputers to take on the nation’s Grand Challenges.