HPC and beer have always had a certain affinity, ever since the days when Cray Research would include a case of Leinenkugel's with every supercomputer. Now Brian Caulfield from NVIDIA writes that a Pennsylvania startup is using GPUs and deep learning to help brewers make better beer.
Using "free beer" as a draw, Jason Cohen of Analytical Flavor Systems has accumulated a "trove of data" that lets him tease out 20 common flaws in a beer from just a handful of tastings. Drinkers record impressions on 25 factors using their smartphones. To handle that data, Cohen's 11-employee team began experimenting with GPUs, which sped up the analysis of taster data threefold. And because Amazon hosts GPU-accelerated servers, the team can simply rent access to the GPUs they need.
Thanks to GPUs, his company's Gastrograph software can now identify dozens of obscure beer styles (Vienna lagers, Irish dry stouts, Berliner Weissbiers) in seconds rather than minutes. That's crucial to detecting bad beer. Buttery diacetyl, for example, improves the thick, creamy body of dark porters and stouts, but it's a fatal flaw in a crisp lager marketed to millions.

Cohen is using GPUs for more than just classifying beers. He's also using them to build models that compare taster-generated flavor profiles against the more than 100,000 beer reviews his company has collected. Without the parallel architecture of GPUs, training deep neural networks with many layers, or random forest models with many trees, was slow for Cohen's team. The team now uses CUDA-backed R packages such as gputools and gmatrix to boost performance, and model tuning takes only minutes to complete.
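To make the modeling concrete: the article describes random forests trained on 25-factor taster profiles to tell beer styles apart. Here is a minimal, CPU-only sketch of that idea in Python: a tiny forest of decision stumps (one-split trees) with bootstrap sampling and random feature subsets, trained on synthetic flavor profiles. The feature indices, the two style prototypes, and all function names are hypothetical illustrations, not Gastrograph's actual model or data.

```python
import random
from collections import Counter

N_FACTORS = 25  # the article's 25 taster-recorded factors

# Hypothetical style prototypes on a 0-5 intensity scale; assume stouts
# score higher on two factors (say, roast at index 3 and body at index 7).
LAGER_BASE = [2.0] * N_FACTORS
STOUT_BASE = [2.0] * N_FACTORS
STOUT_BASE[3], STOUT_BASE[7] = 4.0, 4.0

def synth_profile(rng, base):
    """A noisy taster profile around a style prototype (synthetic data)."""
    return [max(0.0, min(5.0, b + rng.gauss(0, 0.3))) for b in base]

def stump_fit(X, y, feature_indices):
    """Best single split (feature, threshold) by misclassification count."""
    best = None
    for f in feature_indices:
        for t in sorted(set(row[f] for row in X)):
            left = [y[i] for i in range(len(X)) if X[i][f] <= t]
            right = [y[i] for i in range(len(X)) if X[i][f] > t]
            if not left or not right:
                continue
            l_lab = Counter(left).most_common(1)[0][0]
            r_lab = Counter(right).most_common(1)[0][0]
            err = (sum(1 for v in left if v != l_lab)
                   + sum(1 for v in right if v != r_lab))
            if best is None or err < best[0]:
                best = (err, f, t, l_lab, r_lab)
    return best[1:]  # (feature, threshold, left_label, right_label)

def stump_predict(stump, x):
    f, t, l_lab, r_lab = stump
    return l_lab if x[f] <= t else r_lab

def forest_fit(X, y, n_trees=50, n_features=10, seed=1):
    """Random forest of stumps: bootstrap rows, sample a feature subset."""
    rng = random.Random(seed)
    n = len(X)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]          # bootstrap
        feats = rng.sample(range(len(X[0])), n_features)    # feature bag
        forest.append(stump_fit([X[i] for i in idx],
                                [y[i] for i in idx], feats))
    return forest

def forest_predict(forest, x):
    """Majority vote across the ensemble."""
    return Counter(stump_predict(s, x) for s in forest).most_common(1)[0][0]

if __name__ == "__main__":
    rng = random.Random(42)
    X = ([synth_profile(rng, LAGER_BASE) for _ in range(30)]
         + [synth_profile(rng, STOUT_BASE) for _ in range(30)])
    y = ["vienna lager"] * 30 + ["irish dry stout"] * 30
    forest = forest_fit(X, y)
    print(forest_predict(forest, STOUT_BASE))
```

The GPU angle in the article comes from pushing exactly this kind of embarrassingly parallel work (many independent trees, many matrix operations) onto CUDA-accelerated backends; the stumps here would simply be full trees trained in parallel.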