Searching for the -gpu option

Jeff Layton over at Cluster Monkey is on the quest for the -gpu option: a quick way to take an existing code and optimize it for GPUs without a rewrite. He isn’t sure we’ll ever get there either, but along the way you get to read his article, which I really enjoyed. Jeff starts off with some motivating remarks and a survey of the state of the field:

I’m sure by now everyone has heard that you can run real code on GPUs (Graphics Processing Units). GPUs are the graphics cards in your desktop or even the graphics engines running your game consoles at home (never at work – right?). The potential performance improvement for codes or algorithms that can take advantage of the GPU’s programming model and do most of their computation on the GPU is enormous. There are cases of over 100X performance improvement for some codes running on GPUs relative to CPUs.

…Fairly early on, people realized that GPUs, while showing huge potential, were not going to see widespread adoption given how difficult they were to write code for. So higher-level languages were developed. There is a whole laundry list of languages and I won’t go over them here…

He then bridges into an interesting parallel between where we are now with GPUs and where we were when vector processors were just taking off. Then, real code: Jeff walks us through an example matrix multiplication in Fortran, rewritten for CUDA, then re-implemented using the pragmas in PGI’s forthcoming technology preview.
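To give you a flavor of that middle step, here’s a minimal CUDA matrix-multiply sketch of the kind the article walks through. The kernel, function names, and launch parameters below are my own illustration, not Jeff’s actual code:

    #include <cuda_runtime.h>

    // Each thread computes one element of C = A * B (square N x N matrices).
    __global__ void matmul(const float *A, const float *B, float *C, int N)
    {
        int row = blockIdx.y * blockDim.y + threadIdx.y;
        int col = blockIdx.x * blockDim.x + threadIdx.x;
        if (row < N && col < N) {
            float sum = 0.0f;
            for (int k = 0; k < N; ++k)
                sum += A[row * N + k] * B[k * N + col];
            C[row * N + col] = sum;
        }
    }

    // Host side: allocate device memory, copy data over, launch, copy back.
    void matmul_gpu(const float *A, const float *B, float *C, int N)
    {
        size_t bytes = (size_t)N * N * sizeof(float);
        float *dA, *dB, *dC;
        cudaMalloc(&dA, bytes);
        cudaMalloc(&dB, bytes);
        cudaMalloc(&dC, bytes);
        cudaMemcpy(dA, A, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dB, B, bytes, cudaMemcpyHostToDevice);

        dim3 block(16, 16);
        dim3 grid((N + block.x - 1) / block.x, (N + block.y - 1) / block.y);
        matmul<<<grid, block>>>(dA, dB, dC, N);

        cudaMemcpy(C, dC, bytes, cudaMemcpyDeviceToHost);
        cudaFree(dA); cudaFree(dB); cudaFree(dC);
    }

Notice how much of that is bookkeeping: the actual arithmetic is three lines, and everything else is memory management and thread indexing. That’s exactly the boilerplate the pragma approach aims to generate for you.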

So what is so special about this announcement from The Portland Group? I’m glad you asked 🙂 What PGI has done is add pragmas, or compiler directives, to their compilers. These pragmas allow the compiler to analyze the code and generate GPU code that is then sent to the NVIDIA compiler (the CUDA compiler is part of the freely available CUDA software). The PGI compiler continues to compile the CPU-based code and link it with the CUDA-built GPU binary. Then you execute the resulting combination.

Since we’re all geeks here (well, at least I am), let’s look in some detail at how you code GPUs now and what PGI’s announcement does for us.
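For comparison, here’s roughly what the directive approach Jeff describes looks like on the same loop nest. The #pragma acc region spelling below follows PGI’s published Accelerator programming model, but this is a sketch of mine, not code from the article, and the exact directive syntax in the technology preview may differ:

    // Same matrix multiply, written as plain C with a PGI-style directive.
    // The compiler analyzes the loop nest inside the region, generates the
    // GPU kernel, and handles the data movement that the CUDA version
    // above spells out by hand.
    void matmul_pragma(const float *A, const float *B, float *C, int N)
    {
        #pragma acc region
        {
            for (int i = 0; i < N; ++i) {
                for (int j = 0; j < N; ++j) {
                    float sum = 0.0f;
                    for (int k = 0; k < N; ++k)
                        sum += A[i * N + k] * B[k * N + j];
                    C[i * N + j] = sum;
                }
            }
        }
    }

If memory serves, you build this with something like pgcc -ta=nvidia, though you should check PGI’s documentation for the actual flags. The point stands either way: no kernels, no cudaMalloc, no explicit copies.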

I’ve long since traded my geek cred for wicked PowerPoint skills, but even I found the article helpful. Check it out.