
Supercomputing the Mass Difference of Neutrons and Protons

The fact that the neutron is slightly more massive than the proton is the reason why atomic nuclei have exactly those properties that make our world and ultimately our existence possible.

Researchers using the JUQUEEN supercomputer have computed the small mass difference between protons and neutrons.

Video: Prototyping Byte-Addressable NVM Access


In this video from the 2015 OFS Developer’s Workshop, Bernard Metzler presents: Prototyping Byte-Addressable NVM Access.

IBM’s Dave Turek on the First Look at OpenPOWER Hardware for HPC


This week at the OpenPOWER Summit in San Jose, the OpenPOWER Foundation showed off real hardware for the first time with 13 systems, including a prototype HPC server from IBM and a new microprocessor customized for China. Built collaboratively by OpenPOWER members, the new solutions exploit the POWER architecture to provide more choice, customization, and performance to customers, including hyperscale data centers.

Radio Free HPC Wraps up the 2015 GPU Technology Conference


In this episode, the Radio Free HPC team wraps up the GPU Technology Conference. The theme of the show this year was Deep Learning, a topic that is heating up the market for GPUs with challenges like image recognition and self-driving cars. As a sister conference, the OpenPOWER Summit this week in San Jose showcased the first OpenPOWER hardware, including a prototype HPC server from IBM that will pave the way to the two IBM/Nvidia/Mellanox Coral supercomputers expected in 2017.

Interview: Why Software Defined Infrastructure Makes Sense for HPC

Jay Muelhoefer, IBM

“I came to IBM via the acquisition of Platform Computing. There have also been other IBM assets around HPC, namely GPFS. What’s been the evolution of those items, how they really come together under this concept of software-defined infrastructure, and how we’re now taking these capabilities and expanding them into other initiatives that have sort of bled into the HPC space.”

IBM Platform Computing Delivers New HPC Cloud Offerings


Clusters that are purchased for specific applications tend not to be flexible as workloads change. What is needed is an infrastructure that can expand or contract as the workload changes. IBM, a recognized leader in High Performance Computing, is applying its expertise in both HPC and cloud computing to bring these technologies together to create the HPC Cloud.

Test Bed Systems Pave the Way for 150 Petaflop Summit Supercomputer

Philip Curtis, a member of the High-Performance Computing Operations group at the OLCF, works with Pike, one of the test systems being used to prepare for Summit.

Oak Ridge is preparing for its upcoming Summit supercomputer with two modest test bed systems using POWER8 processors. “Summit will deliver more than five times the computational performance of Titan’s 18,688 nodes, using only approximately 3,400 nodes when it arrives in 2017.”

Slidecast: Software Defined Infrastructure for HPC


“Imagine an entire IT infrastructure controlled not by hands and hardware, but by software. One in which application workloads such as big data, analytics, simulation and design are serviced automatically by the most appropriate resource, whether running locally or in the cloud. A Software Defined Infrastructure enables your organization to deliver IT services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs.”

Clusters For Dummies


Clusters for Dummies is an excellent read for experienced and novice administrators alike, as well as for users, purchasing departments, and developers who are considering purchasing or specifying a cluster.

IBM Redefines Storage Economics with New Software


“A new approach is needed to help clients address the cost and complexity driven by tremendous data growth,” said Tom Rosamilia, Senior Vice President, IBM Systems. “Traditional storage is inefficient in today’s world where the value of each piece of data is changing all the time. IBM is revolutionizing storage with our Spectrum Storage software that helps clients to more efficiently leverage their hardware investments to extract the full business value of data.”