Realizing Dreams: Big Data & Computing in Very Large Projects & Companies

In this video from the 2013 XLDB conference, panelists Richard Mount from SLAC, Jeff Dean from Google, and Greg Papadopoulos from New Enterprise Associates take audience questions on Realizing Dreams: Big Data & Computing in Very Large Projects & Companies.

The theme of this year’s conference was how “big data” projects are created. Speakers gave perspectives from industry and science as well as from funding sources, such as the National Science Foundation and venture capitalists.

“There is still a gulf between scientific and industrial projects,” Becla said. “Government funding agencies are understandably conservative when funding huge science projects, such as accelerators or telescopes.” They often lock in technology early in a project’s development. Commercial data projects, on the other hand, have much shorter development time scales and often must be able to change to accommodate rapidly evolving needs. Google and Facebook, for example, use open source databases so they can take advantage of enhancements created by their wide-ranging user communities. It can be a headache to keep up. “Every year we see last year’s best solutions fading away, superseded by newer ones,” Becla said.

In what I find to be a rather surprising conclusion, Richard Mount says that while they have become very good at building giant machines that have never been built before, producing the required power supplies remains a struggle.

Read the Full Story or check out more talks from the conference at YouTube.