Henry Newman on Why Backblaze is Still Wrong About Disk Reliability

Over at Enterprise Storage Forum, Henry Newman comes out of the shortest retirement since Brett Favre's to take Backblaze to task over its recent disk reliability study. “With a known lower hard error rate, why would Backblaze use consumer drives for an enterprise application? Maybe they do not think their users’ backup data is that important and believe that all that is needed is consumer drives. Nothing has changed from last year; there is no comparison of consumer and enterprise drives.”

Radio Free HPC Sorts Through the News Grab Bag

In this episode, the Radio Free HPC team digs into the Grab Bag for Topics of the Week. Dan attended the Lenovo Analyst Conference, and they have him convinced that the company is Going Big on HPC. Rich notes that D-Wave Systems has just landed an additional $29 Million in financing. Is Quantum Computing ready for Prime Time? Finally, Henry is looking forward to seeing what the President’s science priorities are going to be when his budget comes out this week.

Glenn Lockwood on the NSF Future Directions Interim Report

Glenn K. Lockwood writes that a recent interim report on NSF Advanced Computing raises big concerns about the future of HPC at the National Science Foundation. With NSF funding currently under fire in Congress, I’m thinking concerned taxpayers should give this a good read.

Radio Free HPC Compares Providers at CloudHarmony

In this podcast, the Radio Free HPC team looks at CloudHarmony, an online service that measures and compares cloud provider uptime. “At CloudHarmony, we simplify the comparison of cloud services by providing reliable and objective performance analysis, reports, commentary, metrics and tools.”

New Exascale Strategies Go Beyond HPC Market

Tom Wilkie, Scientific Computing World

“While HPC is being pulled in this direction by external market forces, it became clear at the US Supercomputing Conference, SC14, held this year in New Orleans in late November, that the technologies underpinning technical high-performance computing are now changing in response. Paradoxically, the announcement of the largest US Government investment in technical supercomputing for many years will transform business computing.”

Radio Free HPC Looks at Modding the Student Cluster Competitions

“Henry wants to codify rules that span all competitions in order to provide a level playing field and to satisfy his authoritarian nature. Dan isn’t so sure that would work, given that each of the sponsoring organizations has its own ideas about how best to run a competition. However, both of them believe that the competitions need to become more real-world when it comes to systems, applications, and how they’re used. One of the first steps along this road, the guys agree, is to add a storage component to the competitions.”

Radio Free HPC Looks at Network Security in Light of 2014 Data Breaches

Linda Millis, Senior VP for Business Development, Sol-Pass

In this episode, the Radio Free HPC team looks at network security in light of the recent series of breaches at places like Sony, Home Depot, and Target. Our special guest this week is security expert Linda Millis, Senior VP for Business Development at Sol-Pass.

Funding the March Towards Exascale

Robert Roe

In this special guest feature from Scientific Computing World, Robert Roe writes that recent DOE funding for high-end supercomputing bodes well for the continuing march to Exascale levels of computation.

A Look at 2014 as the Year in Parallel Computing

Over at the Go Parallel Blog, John O’Donnell has posted highlights of the year 2014 in parallel computing.

Hyped Technologies that Won’t Shine in 2015

Enrico Signoretti

“100% Flash in the Datacenter? It won’t happen any time soon. Many (most?) tier one workloads will be moved to flash of course, but data is adding up so quickly that it’s highly unlikely you will be seeing a 100% flash datacenter any time soon. It will take a few years to have about 10-20% of data stored on flash, and the rest will remain on huge hard disks (cheap 10+TB hard disks will soon be broadly available, for example).”