NAG Library Adds New Algorithms for Application Developers


Today the Numerical Algorithms Group (NAG) released its latest NAG Library, which includes over 80 new mathematical and statistical algorithms.

Why Storage Matters to Scientists

Tom Wilkie, Scientific Computing World

As the Large Hadron Collider restarts at CERN, data storage has become as important to scientists as compute power. But, as Tom Wilkie from Scientific Computing World reports, the innovative technologies being developed have much wider applications.

Penguin On Demand Powers Oracle Team USA for America’s Cup


Today Penguin Computing announced that Oracle Team USA is using Penguin Computing on Demand (POD) in conjunction with NUMECA’s FINE/Marine CFD software for hydrodynamic modeling.

Oakley Cluster Powers Satellite Surface Mapping for Disaster Relief in Nepal

Two university research teams are employing satellite imagery and supercomputers to produce high-resolution images to aid the Nepali earthquake relief effort. This image is a hillshade-rendered Digital Terrain Model image of the Kathmandu Valley, Nepal, created by SETSM software.

Researchers are using supercomputer automated surface mapping technology to help with disaster relief and longer-term stabilization planning efforts related to the recent earthquake in Nepal.

New Intel Xeons Target Real-Time Analytics


Today Intel announced the new Xeon processor E7-8800/4800 v3 product families, delivering accelerated business insight through real-time analytics.

BlueTides on Blue Waters: The First Galaxies

Figure 1 from the BlueTides Simulation paper, reproduced with permission. z = 8 refers to a redshift of 8, when the universe was a little over half a billion years old.

“The largest high-redshift cosmological simulation of galaxy formation ever has been recently completed by a group of astrophysicists (Drs. Feng, Di-Matteo, Croft, Bird, and Battaglia) from the U.S. and the U.K. This tour-de-force simulation was performed on the Blue Waters Cray XE/XK system at NCSA and employed 648,000 cores. They utilized approximately 700 billion particles (!) to represent dark matter and ordinary matter and to create virtual galaxies inside the supercomputer. The authors, who represent Carnegie Mellon University, UC Berkeley, Princeton University, and the University of Sussex, have given their simulation the moniker BlueTides.”
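That redshift-to-age conversion can be sanity-checked numerically. The sketch below assumes a flat Lambda-CDM cosmology with H0 ≈ 67.7 km/s/Mpc and Ωm ≈ 0.31; these are generic Planck-era values, not parameters taken from the BlueTides paper itself.

```python
import math

# Assumed cosmological parameters (illustrative, not from the paper)
H0_KM_S_MPC = 67.7   # Hubble constant
OMEGA_M = 0.31       # matter density
OMEGA_L = 1.0 - OMEGA_M

MPC_KM = 3.0857e19           # kilometers per megaparsec
GYR_S = 3.1557e16            # seconds per gigayear
H0 = H0_KM_S_MPC / MPC_KM    # H0 in 1/s

def age_at_redshift(z, steps=100_000):
    """Age of the universe t(z) = integral_0^{1/(1+z)} da / (a * H(a)), in Gyr."""
    a_max = 1.0 / (1.0 + z)
    da = a_max / steps
    total = 0.0
    for i in range(steps):
        a = (i + 0.5) * da  # midpoint rule avoids the a = 0 singularity
        h = H0 * math.sqrt(OMEGA_M / a**3 + OMEGA_L)
        total += da / (a * h)
    return total / GYR_S

print(round(age_at_redshift(8), 2))  # roughly 0.64 Gyr
```

Under these assumptions the universe at z = 8 is about 0.64 billion years old, consistent with the "a little over 1/2 billion years" in the figure caption.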

SDSC Trestles Supercomputer to Move to University of Arkansas


SDSC’s recently decommissioned Trestles supercomputer is moving to the Arkansas High Performance Computing Center.

Petascale Comet Supercomputer Enters Early Operations


“Comet is really all about providing high-performance computing to a much larger research community – what we call ‘HPC for the 99 percent’ – and serving as a gateway to discovery,” said SDSC Director Michael Norman, the project’s principal investigator. “Comet has been specifically configured to meet the needs of researchers in domains that have not traditionally relied on supercomputers to solve their problems.”

Numerical Optimization for Deep Learning


“With the advent of massively parallel computing coprocessors, numerical optimization for deep-learning disciplines is now possible. Complex real-time pattern recognition, for example, that can be used for self driving cars and augmented reality can be developed and high performance achieved with the use of specialized, highly tuned libraries. By just using the Message Passing Interface (MPI) API, very high performance can be attained on hundreds to thousands of Intel Xeon Phi processors.”
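The MPI pattern the quote alludes to is data-parallel training: each rank computes a gradient on its own data shard, and an allreduce averages the gradients so every rank applies the same update. The sketch below simulates that pattern in plain Python with hypothetical names; a real deployment would replace `allreduce_mean` with `MPI_Allreduce` across Xeon Phi nodes.

```python
# Schematic of data-parallel gradient averaging (the allreduce pattern).
# Each "rank" holds a shard of data for a toy least-squares fit y = w*x.

def local_gradient(w, shard):
    # Gradient of sum((w*x - y)^2) over this rank's shard, averaged
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def allreduce_mean(values):
    # Stand-in for MPI_Allreduce with MPI_SUM, then dividing by rank count
    return sum(values) / len(values)

def step(w, shards, lr=0.05):
    grads = [local_gradient(w, s) for s in shards]  # in MPI: one per rank
    return w - lr * allreduce_mean(grads)

# Toy data y = 3x, split across four simulated ranks
shards = [[(x, 3.0 * x)] for x in (1.0, 2.0, 3.0, 4.0)]
w = 0.0
for _ in range(200):
    w = step(w, shards)
print(round(w, 2))  # converges toward 3.0
```

Because the averaged gradient equals the gradient over the full dataset, every rank stays synchronized; scaling to hundreds of nodes is then a question of how efficiently the allreduce itself is implemented.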

Thomas Lippert Elected Chairman of Gauss Supercomputing Centre


Today GCS announced that Professor Dr. Dr. Thomas Lippert has been elected Chairman of the Board of Directors at the Gauss Centre for Supercomputing. Professor Lippert also serves as Director of the Institute for Advanced Simulation at the Forschungszentrum Jülich and Head of the Jülich Supercomputing Centre.