
Scratch to Supercomputers: Bottoms-up Build of Large-scale Computational Lensing Software

Gilles Fourestey from EPFL gave this talk at the Swiss HPC Conference. “LENSTOOL is a gravitational lensing software that models mass distribution of galaxies and clusters. It is used to obtain sub-percent precision measurements of the total mass in galaxy clusters and constrain the dark matter self-interaction cross-section, a crucial ingredient to understanding its nature.”
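LENSTOOL's parametric cluster models are far more sophisticated, but the basic quantity any lensing code builds on is the gravitational deflection angle. A minimal sketch, using the textbook point-mass formula α = 4GM/(c²b) rather than anything from LENSTOOL itself (the function name and constants here are illustrative):

```python
# Deflection angle of a point-mass gravitational lens: alpha = 4GM / (c^2 b).
# Illustrative only -- LENSTOOL fits extended parametric mass profiles, not point masses.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def deflection_angle(mass_kg: float, impact_parameter_m: float) -> float:
    """Einstein deflection angle in radians for a point-mass lens."""
    return 4.0 * G * mass_kg / (C**2 * impact_parameter_m)

# Classic check: light grazing the Sun (impact parameter = solar radius).
alpha = deflection_angle(M_SUN, 6.96e8)
arcsec = alpha * 206265.0  # radians -> arcseconds; ~1.75", Eddington's 1919 result
```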

Why the World’s Largest Telescope Relies on GPUs

Over at the NVIDIA blog, Jamie Beckett writes that the new European Extremely Large Telescope, or E-ELT, will capture images 15 times sharper than the dazzling shots the Hubble telescope has beamed to Earth for the past three decades. Researchers are running GPU-powered simulations to predict how different configurations of the E-ELT will affect image quality; changes to the angle of the telescope's mirrors, different numbers of cameras, and other factors could improve image quality.

HACC: Fitting the Universe inside a Supercomputer

Nicholas Frontiere from the University of Chicago gave this talk at the DOE CSGF Program Review meeting. “In response to the plethora of data from current and future large-scale structure surveys of the universe, sophisticated simulations are required to obtain commensurate theoretical predictions. We have developed the Hardware/Hybrid Accelerated Cosmology Code (HACC), capable of sustained performance on powerful and architecturally diverse supercomputers to address this numerical challenge. We will investigate the numerical methods utilized to solve a problem that evolves trillions of particles, with a dynamic range of a million to one.”
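HACC's actual solvers combine particle-mesh and tree methods tuned per architecture, but the core task of evolving gravitating particles can be sketched with a direct-summation leapfrog step. This is a toy illustration under those assumptions, not HACC's method (direct summation is O(N²) and would never scale to trillions of particles):

```python
import numpy as np

# Toy direct-summation N-body step (G = 1), for illustration only.
def accelerations(pos, mass, soft=1e-2):
    """Pairwise gravitational accelerations with Plummer softening."""
    diff = pos[None, :, :] - pos[:, None, :]      # (N, N, 3) separation vectors
    r2 = (diff**2).sum(-1) + soft**2              # softened squared distances
    inv_r3 = r2**-1.5
    np.fill_diagonal(inv_r3, 0.0)                 # zero out self-interaction
    return (diff * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """Kick-drift-kick update; symplectic, so long-run energy drift stays bounded."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel
```

Because the pairwise forces are antisymmetric, total momentum is conserved to floating-point precision, which is a handy sanity check for any N-body integrator.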

SC17 Keynote Looks at the SKA Telescope: Life, the Universe, and Computing

In this special guest feature, Robert Roe reports from the SC17 conference keynote. “Philip Diamond, director general of SKA and Rosie Bolton, SKA regional centre project scientist and project scientist for the international engineering consortium designing the high performance computing systems used in the project, took to the stage to highlight the huge requirements for computation and data processing required by the SKA project.”

Berkeley Lab-led Collaborations win HPC Innovation Awards

Two Berkeley Lab-led projects—Celeste and Galactos—were honored with Hyperion Research’s 2017 HPC Innovation Excellence Awards for “the outstanding application of HPC for business and scientific achievements.” The HPC Innovation Excellence awards are designed to showcase return on investment and success stories involving HPC; to help other users better understand the benefits of adopting HPC; and to help justify HPC investments, including for small and medium-size enterprises.

Podcast: Optimizing Cosmos Code on Intel Xeon Phi

In this TACC podcast, Cosmos code developer Chris Fragile joins host Jorge Salazar for a discussion on how researchers are using supercomputers to simulate the inner workings of black holes. “For this simulation, the manycore architecture of KNL presents new challenges for researchers trying to get the best compute performance. This is a computer chip that has lots of cores compared to some of the other chips one might have interacted with on other systems,” McDougall explained. “More attention needs to be paid to the design of software to run effectively on those types of chips.”
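The manycore point boils down to decomposing work into many independent chunks so that dozens of cores stay busy. A toy sketch of that idea (nothing here is from the Cosmos code; the chunked numerical integral is purely illustrative):

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

# Illustrative manycore-style decomposition: integrate f(x) = x^2 over [0, 1]
# by splitting the domain into many independent chunks and summing the results.
def chunk_sum(bounds):
    lo, hi = bounds
    n = 10_000
    xs = np.linspace(lo, hi, n, endpoint=False)
    return float((xs**2).sum() * (hi - lo) / n)   # left Riemann sum on this slice

def parallel_integral(n_chunks=64, n_workers=8):
    edges = np.linspace(0.0, 1.0, n_chunks + 1)
    tasks = list(zip(edges[:-1], edges[1:]))       # one (lo, hi) slice per chunk
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(chunk_sum, tasks))     # close to 1/3, the exact value
```

Oversubscribing chunks relative to workers (64 chunks for 8 workers here) is a common way to keep all cores fed when chunk costs vary, which is the kind of software-design attention the quote refers to.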

Video: Supercomputing Models Enable Detection of a Cosmic Cataclysm

In this podcast, Peter Nugent from Berkeley Lab explains how scientists confirmed the first-ever measurement of the merger of two neutron stars and its explosive aftermath. “Simulations succeeded in modeling what would happen in an incredibly complex phenomenon like a neutron star merger. Without the models, we probably all would have been mystified by exactly what we were seeing in the sky.”

Illinois Supercomputers Tag Team for Big Bang Simulation

Researchers are tapping Argonne and NCSA supercomputers to tackle the unprecedented amounts of data involved with simulating the Big Bang. “Researchers performed cosmological simulations on the ALCF’s Mira supercomputer, and then sent huge quantities of data to UI’s Blue Waters, which is better suited to perform the required data analysis tasks because of its processing power and memory balance.”

How Extreme Energy Jets Escape a Black Hole

Researchers are using XSEDE supercomputers to better understand the forces at work at the center of the Milky Way galaxy. The work could reveal how instabilities develop in extreme energy releases from black holes. “While nothing – not even light – can escape a black hole’s interior, the jets somehow manage to draw their energy from the black hole.”

Video: Dark Matter – Detecting Gravity’s Hidden Hand

“One of today’s great challenges in physics is to observe individual dark matter particles coming in from the galaxy and striking particles on Earth. This talk presents the evidence for dark matter and introduces one of the most ambitious efforts to discover interactions of dark matter particles, using tons of cryogenic liquid in a deep underground laboratory.”