Big data changing the way we think


Michael Feldman has an interesting blog post at HPCwire this week on big data:

Fresh from ISC’08 and the associated petaflop-mania, I noticed that the latest issue of Wired magazine has a series of articles on the ramifications of petabyte data. The issue is titled “The End of Science,” and the main thesis is that these enormous data sets are forcing us to rethink the way traditional science is performed.

…In the Wired piece titled “The Data Deluge Makes the Scientific Method Obsolete,” the author posits that when data reaches petabyte size, it’s not just more of the same. With such a quantity of data to draw from, researchers no longer need to bother with hypotheses to be tested; in fact, it’s often not practical to do so. Instead, statistical magic can be applied so that the data itself shapes the solution. For example, Google doesn’t “know” why one Web page is better than another; it just exposes the usage patterns. In a nutshell: correlation is in, models are out.

It’s a thought-provoking post.
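
As a toy illustration of the “correlation is in, models are out” idea, here is a minimal Python sketch. The data, pages, and visit rates are entirely hypothetical (nothing here reflects Google’s actual systems): it ranks pages and surfaces co-visit patterns purely from simulated click logs, without ever positing a model of why one page is better than another.

```python
# Toy illustration (hypothetical data): rank pages purely by observed
# usage patterns, with no underlying model of *why* a page is better.
import numpy as np

rng = np.random.default_rng(0)

# Simulated click logs: rows = user sessions, columns = pages.
# True means the page was visited in that session.
n_sessions, n_pages = 10_000, 5
clicks = rng.random((n_sessions, n_pages)) < [0.30, 0.05, 0.15, 0.50, 0.10]

# Let the data itself shape the ranking: no hypothesis about page
# quality is ever formulated or tested.
visit_rate = clicks.mean(axis=0)
ranking = np.argsort(visit_rate)[::-1]

for page in ranking:
    print(f"page {page}: visited in {visit_rate[page]:.1%} of sessions")

# Co-occurrence patterns (which pages tend to be visited together)
# likewise fall straight out of the correlation matrix of the raw logs.
co_visit = np.corrcoef(clicks.T)
print("co-visit correlations:\n", np.round(co_visit, 2))
```

The ranking and the co-occurrence structure emerge from the logs alone; whether that counts as doing science is exactly what the comment below takes issue with.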

Comments

  1. and also very wrong. I was rather shocked to see how little Chris Anderson actually understands science. All these new data points, all this information does change the way we need to approach problems, but just as each generational shift creates changes in scientific thinking, techniques and sometimes philosophy, it only reinforces the scientific method. We still need sound models that can reconcile the known data (in this case a ton of data), the signals in that data, and our observed knowledge. I think Chris needs to spend time working with a few research labs to figure out what reality is and not just write for the sake of writing.