Intel Details Future Graphics Silicon

On Friday of last week, Intel unveiled more details on its upcoming graphics chip, Larrabee.  Tom Forsyth, a software and hardware engineer at Intel, spoke at the Game Developers Conference in San Francisco.  According to Forsyth, Larrabee will be Intel's first many-core architecture.  It will look and behave much like a GPU, but will contain x86 processor cores rather than pure GPU cores.

“It’s based on a lot of small, efficient in-order cores. And we put a whole bunch of them on one bit of silicon. We join them together with very high bandwidth communication so they can talk to each other very fast and they can talk to off-chip memory very fast and they can talk to other various units on the chip very fast.” [Tom Forsyth, Intel]

Larrabee Details

Forsyth went on to discuss the centerpiece of what will become Larrabee: the vector unit.  Larrabee will include a SIMD vector unit with associated vector instructions.  Independent of its host scalar unit, each vector unit can perform up to 16 operations per clock cycle.  Consider having 32 of these cores inside a single Larrabee: one could achieve a peak throughput of 512 operations per clock cycle, independent of the scalar units.  Even at a low clock rate, the overall peak capability of Larrabee is interesting, to say the least.
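The peak-throughput arithmetic above can be sketched in a few lines. The vector width (16) and core count (32) come from the article; the clock rate is an assumed illustrative value, not a figure Intel has announced:

```python
# Figures from the article: a 16-wide SIMD vector unit per core, 32 cores per chip.
VECTOR_WIDTH = 16   # vector operations per core per clock
CORES = 32          # example Larrabee configuration

# Peak vector operations per clock across the whole chip.
ops_per_clock = VECTOR_WIDTH * CORES
print(ops_per_clock)  # 512

# Assumed clock rate, purely for illustration -- Intel has not published one.
clock_ghz = 1.0
peak_gops = ops_per_clock * clock_ghz
print(peak_gops)  # 512.0 billion vector operations per second at 1 GHz
```

Even at a modest 1 GHz, the math works out to hundreds of billions of vector operations per second, which is why the design draws comparisons to GPUs.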

“A funny thing happened on the way to the architecture. We designed this architecture to be 100 percent graphics focused. Whatever we needed to do to get graphics good, we did. And then a year ago, we looked at what we had and said how much of this stuff is actually specific to graphics. It turns out, very little. Graphics workloads are increasingly similar to GPGPU (general-purpose computing on graphics processing units), increasingly similar to high-performance computing. So, we actually have very little that is specific to graphics. Most of the instruction set is very general-purpose.” [Tom Forsyth]

This is very exciting stuff from Intel.  For more info, read the full article here.
