Why Does Hadoop Have Such an Uncomfortable Fit in HPC?


Over at Glenn K. Lockwood's Blog, Glenn looks at why Hadoop remains at the fringe of High Performance Computing and what it might take for it to be considered a serious HPC solution.

The evolution of Hadoop has very much been a backwards one; it entered HPC as a solution to a problem which, by and large, did not yet exist. As a result, it followed a common, but backwards, pattern by which computer scientists, not domain scientists, get excited by a new toy and invest a lot of effort into creating proof-of-concept codes and use cases. Unfortunately, this sort of development is fundamentally unsustainable because of its nucleation in a vacuum, and in the case of Hadoop, researchers moved on to the next big thing and largely abandoned their model applications as the shine of Hadoop faded (see sidebar). This has left a graveyard of software, documentation, and ideas that are frozen in time and rapidly losing relevance as Hadoop moves on.

Read the Full Story.