Accelerating Science with SciDB from NERSC

SciDB harnesses parallel architectures for fast analysis of terabyte-scale arrays of scientific data. This collage illustrates some of the scientific areas that have benefited from NERSC's implementation of SciDB, including astronomy, biology and climate. (Image Credit: Yushu Yao, Berkeley Lab)

Over at NERSC, Linda Vu writes that the SciDB open source database system is a powerful tool for helping scientists wrangle Big Data. “SciDB is an open source database system designed to store and analyze extremely large array-structured data—like pictures from light sources and telescopes, time-series data collected from sensors, spectral data produced by spectrometers and spectrographs, and graph-like structures that illustrate relationships between entities.”
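The chunk-wise aggregation pattern that array databases like SciDB apply to sensor time-series can be sketched in plain Python. This is an illustrative example only, not SciDB's actual AQL/AFL query language; the names (`readings`, `window`) are hypothetical.

```python
from statistics import mean

readings = list(range(12))   # 12 hourly sensor samples (toy data)
window = 4                   # aggregate in 4-sample chunks

# Partition the series into fixed-size chunks and reduce each one,
# mirroring how array databases chunk large arrays and aggregate
# each chunk in parallel across nodes.
chunk_means = [mean(readings[i:i + window])
               for i in range(0, len(readings), window)]
print(chunk_means)  # [1.5, 5.5, 9.5]
```

In a real deployment the chunks live on different cluster nodes and the per-chunk reductions run in parallel, which is what makes the approach scale to terabyte arrays.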

NASA Charts Sea Level Rise

NASA visualization shows shifts in the Gulf Stream (in blue). Yellow shows drops in sea level, while orange shows increases.

“Sea level rise is one of the most visible signatures of our changing climate, and rising seas have profound impacts on our nation, our economy and all of humanity,” said Michael Freilich, director of NASA’s Earth Science Division. “By combining space-borne direct measurements of sea level with a host of other measurements from satellites and sensors in the oceans themselves, NASA scientists are not only tracking changes in ocean heights but are also determining the reasons for those changes.”

Rescale Launches Cloud HPC Platform in Europe


Today Rescale announced the availability of its simulation and HPC platform in the Europe region. As an HPC cloud provider, Rescale offers a software platform and hardware infrastructure for companies to perform scientific and engineering simulations.

Intel Invests in BlueData for Spinning Up Spark Clusters on the Fly


Today Intel Corporation and BlueData announced a broad strategic technology and business collaboration, as well as an additional equity investment in BlueData from Intel Capital. BlueData is a Silicon Valley startup that makes it easier for companies to install Big Data infrastructure, such as Apache Hadoop and Spark, in their own data centers or in the cloud.

From Grand Challenges to Critical Workflows


Geert Wenes writes in the Cray Blog that the next generation of Grand Challenges will focus on critical workflows for Exascale. “For every historical HPC grand challenge application, there is now a critical dependency on a series of other processing and analysis steps, data movement and communications that goes well beyond the pre- and post-processing of yore. It is iterative, sometimes synchronous (in situ) and generally more on an equal footing with the ‘main’ application.”

SDSC Gets One-year Extension for Gordon Supercomputer


The National Science Foundation has awarded the San Diego Supercomputer Center (SDSC) a one-year extension to continue operating its Gordon supercomputer, providing continued access to the cluster for a wide range of researchers with data-intensive projects.

Intel Updates Developer Toolkit with Data Analytics Acceleration Library


Today Intel released Intel Parallel Studio XE 2016, the next iteration of its developer toolkit for HPC and technical computing applications. This release introduces the Intel Data Analytics Acceleration Library, a library for big data developers that turns large data clusters into meaningful information with advanced analytics algorithms.

Dell Posts Agenda for La Jolla Genomics Data Workshop


Dell has posted the agenda for its upcoming Genomics Data Workshop. Entitled “Enabling discovery and product innovation with Dell HPC and Big Data Solutions,” the event will take place Sept. 15 in La Jolla, CA.

Video: Democratizing Data Science


“CDSW’s organizers are professional programmers and data scientists and several of us have experience teaching data science in more traditional university and corporate settings. Our talk will describe how “democratized” data science is similar to — and sometimes extremely different from — these more traditional approaches. We will talk about some of the challenges we have faced and highlight some of our most inspirational successes.”

Pushing the Boundaries of Combustion Simulation with Mira


“Researchers at the U.S. Department of Energy’s Argonne National Laboratory will be testing the limits of computing horsepower this year with a new simulation project from the Virtual Engine Research Institute and Fuels Initiative (VERIFI) that will harness 60 million computer core hours to dispel those uncertainties and pave the way to more effective engine simulations.”