Deep Learning for Natural Language Processing – Choosing the Right GPU for the Job


Download the full report.

Machine translation of human languages has been a dream of researchers ever since computers powerful enough for the job became available. That was long before the internet became a thing and smartphones appeared in everyone's hands. But now, unlike fifty years ago, the techniques of artificial intelligence and deep learning, along with advances in hardware platforms and massively parallel systems, have put quality language translation at everyone's fingertips.

We're no longer talking about restricted vocabulary and syntax, but full everyday speech, say, from English to German. And not just for translating academic papers, but coupled with speech recognition in real time on a tourist's cellphone!

Making this possible are the latest developments in neural networks and deep learning systems and, in particular, a neural network architecture called the transformer. Researchers have shown that transformer networks are particularly well suited to parallelization on GPU-based systems. These networks outperform traditional machine translation models and are capable of producing high-quality translations.
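To make the architecture concrete, here is a minimal PyTorch sketch of an encoder-decoder transformer for translation. The class name, vocabulary size, and hyperparameters are illustrative assumptions (roughly a "base"-sized model), not the configuration used in the report, and positional encoding and masking are omitted for brevity.

```python
import torch
import torch.nn as nn

# Illustrative sizes only -- roughly a "base" transformer configuration,
# not necessarily what the report used.
VOCAB_SIZE = 32000   # shared source/target subword vocabulary (assumed)
D_MODEL = 512        # model / embedding dimension

class TranslationTransformer(nn.Module):
    """Bare-bones encoder-decoder transformer for sequence-to-sequence
    translation, built on PyTorch's nn.Transformer. Positional encoding
    and attention masks are omitted to keep the sketch short."""

    def __init__(self):
        super().__init__()
        self.src_embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        self.tgt_embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        self.transformer = nn.Transformer(
            d_model=D_MODEL, nhead=8,
            num_encoder_layers=6, num_decoder_layers=6,
            dim_feedforward=2048, batch_first=True)
        self.generator = nn.Linear(D_MODEL, VOCAB_SIZE)  # hidden state -> vocabulary logits

    def forward(self, src_ids, tgt_ids):
        # src_ids, tgt_ids: (batch, sequence_length) integer token ids
        out = self.transformer(self.src_embed(src_ids), self.tgt_embed(tgt_ids))
        return self.generator(out)

model = TranslationTransformer()
if torch.cuda.is_available():
    model = model.cuda()  # weights (and, during training, activations) live in GPU memory
```

Because every attention head in every layer reduces to dense matrix multiplications over whole batches of sentences, the forward and backward passes map naturally onto a GPU's parallel hardware, which is what makes the architecture such a good fit for GPU training.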

The problem is that transformer networks require very large amounts of GPU memory, well beyond what you find in most entry-level deep learning platforms. Systems based on consumer-grade GPUs, such as the commonly used NVIDIA GeForce RTX 2080 Ti, will quickly run out of memory with the batch sizes required for high-quality results. While well suited to other parallel processing applications, such as computer vision and image processing, consumer-grade GPUs cannot provide the amount of unified GPU memory that transformer models need to achieve the highest-quality translations.
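For a sense of where the memory goes, the back-of-envelope sketch below estimates the activation footprint of a single training batch and compares it with what the installed GPU reports. All of the sizes are illustrative assumptions, and the estimate deliberately ignores gradients, optimizer state, and feed-forward activations, which in practice multiply the footprint several-fold.

```python
import torch

# Back-of-envelope estimate: how much memory do the attention activations of
# ONE training batch need?  All sizes below are illustrative assumptions.
BATCH = 64        # sentences per batch
SEQ_LEN = 256     # subword tokens per sentence
LAYERS = 6        # encoder layers (the decoder roughly doubles this)
HEADS = 8
D_MODEL = 512
BYTES = 4         # float32

# Each layer keeps a (heads x seq x seq) attention-score matrix per sentence,
# plus several (seq x d_model) hidden tensors needed for backpropagation.
attn_scores = BATCH * HEADS * SEQ_LEN * SEQ_LEN * LAYERS * BYTES
hidden_acts = BATCH * SEQ_LEN * D_MODEL * LAYERS * 4 * BYTES  # assume ~4 saved tensors per layer
estimate_gb = (attn_scores + hidden_acts) / 1024**3
print(f"Rough activation footprint: {estimate_gb:.2f} GB "
      f"(excludes gradients, optimizer state, and feed-forward activations)")

# Compare with what the installed GPU actually offers.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB total memory")
```

Scale the batch size, sequence length, or model dimensions up to what high-quality translation training actually demands, add the gradients and optimizer state, and the total quickly pushes past the 11 GB available on an RTX 2080 Ti.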

But how much GPU memory? Which GPU models should you choose? And which training parameter settings give the best quality translations?

Even with the latest advances in training transformer models, little has been published regarding the GPUs that should be used for this task. And that’s what we’ll discuss in this whitepaper.

Download the new report, courtesy of Exxact, Deep Learning for Natural Language Processing – Choosing the Right GPU for the Job, to help make your journey to deep learning fast and smooth.