Growing use of GPUs in life sciences

Our pal Joe Landman has a starring role in this article from GenomeWeb.com on the growing importance of GPUs in life science work.

While there is never any shortage of vendors touting their hardware acceleration tool as the silver bullet for your computing bottlenecks, it’s important to keep in mind that the algorithm and the data set, not the hardware, should inform your choice, along with cost and ease of use. The fact that GPU chipmaker NVIDIA has made porting code to GPUs easier for the average bench biologist with its CUDA software technology strengthens the case for considering this breed of acceleration technology.

One of the things that strikes me as fundamentally different about life science work, compared with the other computational fields I’ve been connected with, is that the field actively encourages anyone to join in the research through the open publication of genome and other data sets. In some areas of life science study concerned with pattern discrimination and matching, all you need is Perl, a laptop, and a network connection to play an active role in the research community.
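To make the "Perl and a laptop" point concrete, here is a minimal sketch of laptop-scale sequence pattern matching — in Python rather than Perl, and with a made-up motif and sequence purely for illustration:

```python
# Minimal sketch of laptop-scale sequence pattern matching.
# The motif and sequence below are illustrative, not real data.
import re

def find_motif(sequence, motif):
    """Return 0-based start positions of every (possibly overlapping)
    match of a regex motif in a DNA sequence string."""
    # A lookahead makes the match zero-width, so overlapping hits
    # are all reported instead of being skipped.
    pattern = re.compile(f"(?=({motif}))")
    return [m.start() for m in pattern.finditer(sequence)]

if __name__ == "__main__":
    seq = "GGTATATATAGG"          # toy sequence
    print(find_motif(seq, "TATA"))  # overlapping motif hits
```

Nothing here needs a cluster; scanning even a bacterial genome this way finishes in seconds on commodity hardware, which is exactly the low barrier to entry described above.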

This is a wonderful democratization of the research effort, and probably means that the speed of innovation and quality of invention in this field will dramatically outpace the other, higher-barrier-to-entry fields that are dominated by the traditional large (rich) research institutions. Another application of my million monkeys coding postulate.

Of course the phrase “democratization” in computing also implies the phrase “I’m broke,” or at least “I’m not rich.” GPUs have disadvantages (programming), and they aren’t right for every job, but what they are is a cheap, low-power source of compute. It makes sense that they would be enjoying increased attention in this field. And because so many small-scale teams and individual researchers are contributing here, I think there will probably be a noticeable network effect that helps GPU adoption — if Suzy Q is using them and having great results, then Johnny X will probably perceive them as a lower-barrier path to entry and use them as well. I don’t see a similar network effect with either clusters or high-end supers, probably because individuals don’t make those decisions; departments and institutions do, and they have motivations beyond just getting work done.

Comments

  1. Does Intel support GPU-style algorithms the way NVIDIA and ATI graphics chips do?
