Video: EUDAT and Big Data in Science

In this video from the 2013 National HPCC Conference, Wolfgang Gentzsch presents: EUDAT and Big Data in Science. Big data science emerges as a new paradigm for scientific discovery that reflects the increasing value of observational, experimental and computer-generated data in virtually all domains, from physics to the humanities and social sciences. Addressing this new […]

DDN – Big Data Evolution

In this video from the HPC Advisory Council Switzerland Conference, James Coomer from DDN presents: Big Data Evolution. DDN has developed a Hadoop solution that is all about time to value: It simplifies rollout so that enterprises can get up and running more quickly, provides typical DDN performance to accelerate data processing, and reduces the […]

Video: Accelerating Big Data with Hadoop (HDFS, MapReduce and HBase) and Memcached

In this video from the HPC Advisory Council Switzerland Conference, D.K. Panda from Ohio State University presents: Accelerating Big Data with Hadoop (HDFS, MapReduce and HBase) and Memcached. Download the slides (PDF).
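The MapReduce programming model at the heart of the talk can be illustrated with a minimal, Hadoop-free sketch in Python. The map, shuffle, and reduce phases below are simplified stand-ins for what Hadoop distributes across a cluster; the functions and data are illustrative only, not drawn from the presentation:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Shuffle: group values by key, as Hadoop does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(values) for word, values in groups.items()}

docs = ["big data needs big systems", "hadoop processes big data"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["big"])  # 3
```

In a real Hadoop job, the shuffle is performed by the framework across the network, and technologies like Memcached (as discussed in the talk) aim to cut the latency of exactly these data-movement steps.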

Think of This: Most of the World’s Data is Unanalyzed

The idea that we use only 10 percent or less of our brain is one of those persistent myths that stubbornly refuses to go away. In reality, that bit of gelatinous grey matter between our ears has been extensively mapped and it appears that most of it has a function. So the notion that if […]

Podcast: The Big Data Revolution

In this podcast from the Leonard Lopate Show, author Viktor Mayer-Schönberger explores how Big Data will affect the economy, science, and society at large. “Big data” refers to our burgeoning ability to crunch vast collections of information, analyze it instantly, and draw sometimes profoundly surprising conclusions from it. Big Data: A Revolution that Will Transform […]

Autotune – Supercomputer-assisted Calibration for Better Energy Models

One way to improve the energy efficiency of buildings is through energy models that simulate aspects such as power, cooling, and heat loss through windows. Until now, however, constructing accurate models for diverse building designs has been very difficult. Over at NICS, Scott Gibson writes that a supercomputer-assisted calibration methodology from Oak Ridge National Laboratory […]
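The calibration idea behind Autotune — tuning model parameters until simulated energy use matches measurements — can be sketched as a toy grid search. The `simulate_energy` model and all numbers here are made-up placeholders for illustration; Autotune's actual methodology runs full building simulations at supercomputer scale:

```python
def simulate_energy(insulation, window_loss):
    # Hypothetical stand-in for a building energy simulation.
    return 100.0 / insulation + 5.0 * window_loss

def calibrate(measured, insulation_grid, window_grid):
    """Grid search: pick the parameter pair whose simulated
    output is closest to the measured energy use."""
    best, best_err = None, float("inf")
    for ins in insulation_grid:
        for win in window_grid:
            err = abs(simulate_energy(ins, win) - measured)
            if err < best_err:
                best, best_err = (ins, win), err
    return best

params = calibrate(measured=60.0,
                   insulation_grid=[1.0, 2.0, 4.0],
                   window_grid=[1.0, 2.0, 3.0])
print(params)  # (2.0, 2.0)
```

An exhaustive search like this explodes combinatorially as parameters are added, which is why calibrating realistic building models calls for HPC resources.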

HPC Industry Veteran John Kirkley Joins insideBigData Staff

It is my pleasure to announce that John Kirkley has joined our sister publication insideBigData as contributing editor. Most recently at the Digital Manufacturing Report, John is one of the most experienced HPC writers in the business, and I’m really looking forward to working with him as we ramp up the publication in the coming […]

SDSC Fosters New BigData Top100 List

This week the San Diego Supercomputer Center (SDSC) announced plans for a community-based effort to create the BigData Top100 List, the first global ranking of its kind for systems designed for big data applications. The explosion in data and the value of repurposing and exploiting data assets have created what we now call the ‘big […]

Cray to Leverage Intel Hadoop Distro for Big Data Play in the Enterprise

Today Cray announced a Big Data solution combining the newly announced Intel Distribution for Apache Hadoop software with the Cray Xtreme line of supercomputers. And if you were wondering what Cray had in mind for its acquisition of Appro last year, this announcement may be showing the way. According to the company, the new offering […]

Video: PSC’s Sherlock Supercomputer Means Business for Graph Computing

In this video, Dr. Nick Nystrom from PSC discusses what makes the Sherlock supercomputer unique and how businesses can take advantage of its graph computing prowess. Sherlock is a YarcData uRiKA (Universal RDF Integration Knowledge Appliance) data appliance with PSC enhancements. It enables large-scale, rapid graph analytics through massive multithreading, a shared address space, sophisticated […]
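The kind of graph traversal that appliances like Sherlock accelerate can be sketched in miniature as a breadth-first search over an adjacency list. This is only an illustration of the workload class, without the massive multithreading or shared address space that the uRiKA hardware provides; the graph below is invented:

```python
from collections import deque

def bfs_distances(graph, source):
    """Compute hop distances from source via breadth-first search."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

# Tiny example graph; graph appliances operate on billions of edges.
graph = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d"],
    "d": [],
}
print(bfs_distances(graph, "a"))  # {'a': 0, 'b': 1, 'c': 1, 'd': 2}
```

Traversals like this have irregular, pointer-chasing memory access patterns with little locality, which is why hardware built for massive multithreading handles them far better than conventional cache-based clusters.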