According to its website, Pervasive DataRush has a simple mission: simplify how you process and analyze big data. The company's product is a parallel dataflow platform designed to eliminate performance bottlenecks in big data preparation and analytics.
Pervasive DataRush recently set a remarkable performance record on an SGI Altix system with 384 cores. Benchmarks of this type are measured in CUPS (cell updates per second), and the company achieved nearly one TeraCUPS on the Smith-Waterman algorithm, a sequence-alignment method popular in bioinformatics.
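To ground the CUPS metric: Smith-Waterman fills a dynamic-programming scoring matrix, and each matrix cell filled counts as one "cell update." Below is a minimal single-threaded sketch in Python for illustration only; it is not DataRush code, and the scoring parameters (match, mismatch, gap) are arbitrary assumptions:

```python
# Minimal Smith-Waterman local-alignment score (illustrative sketch).
# Each assignment to h[i][j] is one cell update -- the unit behind CUPS.

def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local-alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    h = [[0] * cols for _ in range(rows)]  # scoring matrix, zero-initialized
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = h[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment: scores never drop below zero
            h[i][j] = max(0, diag, h[i - 1][j] + gap, h[i][j - 1] + gap)
            best = max(best, h[i][j])
    return best

print(smith_waterman_score("GGTTGACTA", "TGTTACGG"))
```

At nearly one TeraCUPS, a system performs on the order of 10^12 of these cell updates every second; the parallelism comes from the fact that cells along each anti-diagonal of the matrix can be computed independently.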
In this podcast, I catch up with Davin Potts to talk about how Pervasive DataRush leverages the parallel processing capabilities of multicore processors and SMP systems to deliver extreme performance on big data. Download the MP3 or subscribe on iTunes.