
Video: Big Data Powers Climate Research at BSC


In this video from the Barcelona Supercomputing Center, Big Data is presented as a key challenge for researchers studying global climate change. “Changes in the composition of the atmosphere can affect the habitability of the planet by modifying the air quality and altering long-term climate. Research in this area is devoted to the development, implementation and refinement of global and regional state-of-the-art models for short-term air quality forecasting and long-term climate predictions.”

Accelerating Science with SciDB from NERSC

SciDB harnesses parallel architectures for fast analysis of terabyte-scale (TB) arrays of scientific data. This collage illustrates some of the scientific areas that have benefited from NERSC's implementation of SciDB, including astronomy, biology and climate. (Image Credit: Yushu Yao, Berkeley Lab)

Over at NERSC, Linda Vu writes that the SciDB open source database system is a powerful tool for helping scientists wrangle Big Data. “SciDB is an open source database system designed to store and analyze extremely large array-structured data—like pictures from light sources and telescopes, time-series data collected from sensors, spectral data produced by spectrometers and spectrographs, and graph-like structures that illustrate relationships between entities.”
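To give a feel for the array-centric operations SciDB is built around, here is a minimal plain-Python sketch of a windowed mean over a 2D sensor grid. This is an illustrative analogue only: SciDB itself is queried through its own query languages (AFL/AQL) or client libraries, and it evaluates operations like this in parallel across chunks of arrays far too large for a single machine. The function and data below are invented for illustration.

```python
# Illustrative sketch only: a windowed mean over a small 2D grid, the
# kind of array operation SciDB runs in parallel over chunked,
# terabyte-scale arrays. Names and data here are hypothetical.

def window_mean(grid, radius=1):
    """Mean over a (2*radius+1)^2 neighborhood, clipped at the array edges."""
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            vals = [
                grid[a][b]
                for a in range(max(0, i - radius), min(rows, i + radius + 1))
                for b in range(max(0, j - radius), min(cols, j + radius + 1))
            ]
            out[i][j] = sum(vals) / len(vals)
    return out

# A tiny stand-in for, say, a grid of sensor readings.
sensor_grid = [
    [1.0, 2.0, 3.0],
    [4.0, 5.0, 6.0],
    [7.0, 8.0, 9.0],
]
smoothed = window_mean(sensor_grid)
# Center cell averages all nine neighbors: smoothed[1][1] == 5.0
```

The point of a system like SciDB is that the same logical operation runs unchanged whether the array fits in memory or spans thousands of chunks across a cluster.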

Intel Invests in BlueData for Spinning Up Spark Clusters on the Fly


Today Intel Corporation and BlueData announced a broad strategic technology and business collaboration, as well as an additional equity investment in BlueData from Intel Capital. BlueData is a Silicon Valley startup that makes it easier for companies to install Big Data infrastructure, such as Apache Hadoop and Spark, in their own data centers or in the cloud.

A Systems View of Machine Learning


“Despite the growing abundance of powerful tools, building and deploying machine-learning frameworks into production continues to be a major challenge, in both science and industry. I’ll present some particular pain points and cautions for practitioners, as well as recent work addressing some of the nagging issues. I advocate for a systems view, which, when expanded beyond the algorithms and codes to the organizational ecosystem, places some interesting constraints on the teams tasked with development and stewardship of ML products.”

PSC Retires Blacklight Supercomputer to Make Way for Bridges


The big memory “Blacklight” system at the Pittsburgh Supercomputing Center will be retired on Aug. 15 to make way for the new “Bridges” supercomputer. “Built by HP, Bridges will feature multiple nodes with as much as 12 terabytes each of shared memory, equivalent to unifying the RAM in 1,536 high-end notebook computers. This will enable it to handle the largest memory-intensive problems in important research areas such as genome sequence assembly, machine learning and cybersecurity.”

Podcast: HP & Intel Accelerate HPC Solutions

Bill Mannel, VP & GM of HPC & Big Data Business Unit at HP

In this Intel Chip Chat podcast, Bill Mannel from HP stops by to discuss the growing demand for high performance computing solutions and the innovative use of HPC to manage big data. He highlights an alliance between Intel and HP that will accelerate HPC and big data solutions tailored to meet the latest needs and workloads of HPC customers, leading with customized vertical solutions.

Materials Imaging and Data Sciences Converge at Oak Ridge Workshop


Oak Ridge National Lab recently hosted a Materials Imaging Workshop. Entitled “Big, Deep and Smart Data Analytics in Materials Imaging,” the workshop explored research opportunities and challenges arising as imaging and data sciences merge.

Lustre* at the Core of HPC and Big Data Convergence

HPC and Big Data Convergence

Companies already using high-performance computing (HPC) with a Lustre file system for simulations, such as those in the financial, oil and gas, and manufacturing sectors, want to convert some of their HPC cycles to Big Data analytics. This puts Lustre at the core of the convergence of Big Data and HPC.
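One common pattern behind this convergence is running Hadoop or Spark jobs directly against a POSIX-mounted Lustre file system instead of staging data into HDFS. The sketch below shows what that can look like in Hadoop's `core-site.xml`; the mount point `/mnt/lustre` is a hypothetical path, and a production setup would involve considerably more tuning than this fragment suggests.

```xml
<!-- Sketch only, assuming Lustre is POSIX-mounted at /mnt/lustre (a
     hypothetical path). Pointing fs.defaultFS at a file:// URI makes
     Hadoop use its local/shared-filesystem client, so MapReduce and
     Spark jobs read and write the Lustre namespace directly rather
     than going through HDFS. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>file:///</value>
  </property>
  <property>
    <!-- Keep intermediate/scratch data on the shared mount too. -->
    <name>hadoop.tmp.dir</name>
    <value>/mnt/lustre/hadoop-tmp</value>
  </property>
</configuration>
```

The appeal for the companies mentioned above is avoiding a copy: the same files the simulation codes write to Lustre are immediately visible to the analytics jobs.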

Video: HP Moves Forward with New Business Unit for HPC & Big Data

Bill Mannel, VP & GM of HPC & Big Data Business Unit at HP

“As a result of a new alliance with Intel, HP is offering its HPC Solutions Framework based on HP Apollo servers, which are specialized for HPC and now optimized to support industry-specific software applications from leading independent software vendors. These solutions will dramatically simplify the deployment of HPC for customers in industries such as oil and gas, life sciences and financial services. The HP Apollo product line integrates Intel’s technology innovation from its HPC scalable system framework, which helps to extend the resilience, reliability, power efficiency and price/performance of the HP Apollo solutions.”

Job of the Week: Director of Sales at Ryft


Ryft in San Francisco is seeking a Director of Sales in our Job of the Week.