RPI Computer Scientist Wins NSF Award to Match Exascale Systems with Petascale Data Volumes

Rensselaer Polytechnic Institute faculty member and computer scientist George Slota has been granted a National Science Foundation Faculty Early Career Development (CAREER) award to work on the problem of enabling exascale-class systems to handle gigantic, petascale-class volumes of data. “How do we best understand and get insight from this kind of data? To do that, […]

Video: High Performance Clustering for Trillion Particle Simulations

“Modern cosmology and plasma physics codes are capable of simulating trillions of particles on petascale systems. Each time step generated from such simulations is on the order of tens of terabytes. Summarizing and analyzing raw particle data is challenging, and scientists often focus on density structures for follow-up analysis. We develop a highly scalable version of the clustering algorithm DBSCAN and apply it to the largest particle simulation datasets. Our system, called BD-CATS, is the first one to perform end-to-end clustering analysis of trillion-particle simulation output. We demonstrate clustering analysis of a 1.4 trillion particle dataset from a plasma physics simulation, and a 10,240^3 particle cosmology simulation utilizing ~100,000 cores in 30 minutes. BD-CATS has enabled scientists to ask novel questions about acceleration mechanisms in particle physics, and has demonstrated qualitatively superior results in cosmology. Clustering is an example of one scientific data analytics problem. This talk will conclude with a broad overview of other leading data analytics challenges across scientific domains, and joint efforts between NERSC and Intel Research to tackle some of these challenges.”
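For readers unfamiliar with the algorithm being scaled up here: DBSCAN labels a point a "core" point if enough neighbors fall within a radius eps, then grows clusters outward from core points, leaving sparse points as noise. A minimal pure-Python sketch of serial DBSCAN (the parameter values and 2-D points are illustrative; BD-CATS is a distributed, highly optimized implementation, not this loop):

```python
def region_query(points, i, eps):
    """Indices of all points within distance eps of points[i] (2-D)."""
    px, py = points[i]
    return [j for j, (qx, qy) in enumerate(points)
            if (px - qx) ** 2 + (py - qy) ** 2 <= eps ** 2]

def dbscan(points, eps, min_pts):
    """Serial DBSCAN: returns a cluster label per point, -1 for noise."""
    UNVISITED, NOISE = -2, -1
    labels = [UNVISITED] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] != UNVISITED:
            continue
        neighbors = region_query(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = NOISE          # sparse point: noise (for now)
            continue
        labels[i] = cluster            # i is a core point: start a cluster
        seeds = list(neighbors)
        while seeds:                   # expand the cluster outward
            j = seeds.pop()
            if labels[j] == NOISE:
                labels[j] = cluster    # noise reachable from a core: border point
            if labels[j] != UNVISITED:
                continue
            labels[j] = cluster
            jn = region_query(points, j, eps)
            if len(jn) >= min_pts:     # j is also a core point: keep expanding
                seeds.extend(jn)
        cluster += 1
    return labels
```

The naive neighbor search above is O(n^2) per run; scaling this to trillions of particles is exactly the spatial-indexing and load-balancing problem the talk addresses.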

Benchmarks, Schmenchmarks: Big Iron Data and the November 2015 TOP500

In this special guest feature, Peter ffoulkes from OrionX offers his insights on the latest TOP500 listing of the world’s fastest supercomputers. “Most importantly, increasing the number of petascale-capable resources available to scientists, researchers, and other users to as much as 20% of the entire list will be a significant milestone. From a useful-outcome and transformational perspective, it is much more important to support advances in science, research, and analysis than to ring the bell with the world’s first exascale system on the TOP500 in 2018, 2023, or 2025.”

Simulating Geomagnetic Storm Effects on Power Grids

“Using Blue Waters, we are for the first time running highly detailed, global simulations of the Earth-ionosphere waveguide under the effect of a geomagnetic storm. Disturbed ionospheric currents are modeled in a three-dimensional Maxwell’s equations finite-difference time-domain (FDTD) model extending from -400 km to an altitude of 400 km.”
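The FDTD method mentioned above advances Maxwell’s equations by leapfrogging electric- and magnetic-field updates on a staggered grid. A minimal 1-D sketch of that update loop (grid size, Courant factor, and source pulse are illustrative toy values; the Blue Waters model is a global 3-D simulation, not this):

```python
def fdtd_1d(nx=200, steps=150):
    """Toy 1-D FDTD loop: leapfrog E and H field updates on a staggered grid."""
    ez = [0.0] * nx  # electric field samples
    hy = [0.0] * nx  # magnetic field samples, offset half a cell
    courant = 0.5    # update coefficient; <= 1 keeps the 1-D scheme stable
    for t in range(steps):
        # Update H from the spatial difference of E (half time step)
        for i in range(nx - 1):
            hy[i] += courant * (ez[i + 1] - ez[i])
        # Inject a soft Gaussian source pulse at the grid center
        ez[nx // 2] += 2.718281828 ** (-((t - 30) ** 2) / 100.0)
        # Update E from the spatial difference of H (next half time step)
        for i in range(1, nx):
            ez[i] += courant * (hy[i] - hy[i - 1])
    return ez
```

A production ionosphere model replaces this with a 3-D grid in spherical geometry, conductivity profiles for the ground and ionosphere, and domain decomposition across many nodes; the leapfrog structure of the inner loops is the same.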

Magnus Supercomputer Powers Petascale Pioneers Down Under

HPC matters in Australia, where the Pawsey Supercomputing Centre’s Petascale Pioneers program is attracting the world’s best researchers with the Magnus supercomputer. As the most advanced scientific supercomputer in the Southern Hemisphere, Magnus is a petascale Cray XC30 machine with over 35,000 Intel Xeon E5-2600 v3 cores and 95 TB of memory.

This Week in HPC: High-Frequency HPC and Japan Space Agency Preps for Petaflops

In this episode of This Week in HPC, Michael Feldman and Addison Snell from Intersect360 Research discuss high-frequency trading and the High Performance on Wall Street Conference. After that, they look at Fujitsu’s petascale upgrade at the Japan Aerospace Exploration Agency. The novel architecture for this SPARC-powered FX10 follow-on system will feature Micron’s Hybrid Memory Cube technology.