In modern computers the Graphics Processing Unit (GPU) is responsible for everything you see on your screen.

Both Windows Vista and 7 can use the GPU to render the desktop, taking advantage of 3D acceleration features to provide smooth window movement, transparency, and other visual effects. Naturally, the 3D graphics in games, educational titles, and such are all rendered by the GPU.

Modern GPUs even have special video processing units that decode, scale, and de-interlace the most popular video formats, improving quality and reducing CPU load and power consumption. GPUs are even starting to be used as highly parallel processors to do certain very math-heavy tasks much faster than the CPU, though this technology is in its infancy.

But beyond that, it gets a little confusing. There are dozens of brand names and model numbers out there, and a whole alphabet soup of buzzwords and acronyms that seem specifically designed to confuse the average customer.

Let's look at some common graphics-related terms and what you should look for when shopping for a graphics card (or choosing which graphics card should be in your next computer or laptop).

nVidia, ATI and Intel

Today, there are three major players in the graphics market. There's nVidia, a company that focuses almost entirely on graphics products. A few years ago the CPU and chipset maker AMD bought Canadian graphics developer and nVidia competitor ATI.

You'll still see the ATI brand quite often; AMD kept it around for their graphics division. Finally there's Intel, which currently only makes integrated graphics products built into the motherboard chipsets for their processors.

Soon, Intel will start shipping processors with graphics integrated right into the CPU. There are other graphics companies out there, but they either focus on devices like mobile phones or have such a tiny piece of the market that they're not worth bringing up.

Which one should you use? This is a point of much contention among graphics fans and gamers. To be honest, nVidia and ATI/AMD both make excellent products and have drivers that are, on the whole and over time, roughly comparable in terms of stability. If you want a discrete graphics card, pick whichever one offers the best performance at the price you want to pay.

Intel's integrated graphics is essentially what you get when you don't make a choice. Though it has improved greatly over the years, it is still slower than the integrated graphics options from nVidia and ATI, and far slower than discrete graphics solutions.


DirectX

DirectX is an API (Application Programming Interface): a set of conventions and abstractions that let programmers control a piece of hardware such as a GPU. DirectX actually contains lots of pieces to deal with things like audio, but the part that deals with 3D graphics is called Direct3D.

On Windows, DirectX is by far the most common way that games make use of the GPU, but because it comes from Microsoft and makes use of the Windows driver stack, it's only on Windows.
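The 'conventions and abstractions' idea can be sketched in plain Python. This is purely a toy illustration of what an API buys the programmer: every name below is invented, and real Direct3D and OpenGL calls look nothing like this.

```python
# Toy sketch of what a graphics API provides: one set of calls the
# program codes against, with interchangeable implementations (and,
# in real life, different hardware) underneath.

class GraphicsAPI:
    """The 'conventions and abstractions' a programmer relies on."""
    def draw_triangle(self, vertices):
        raise NotImplementedError

class FakeDirect3D(GraphicsAPI):
    def draw_triangle(self, vertices):
        return f"D3D drew a triangle with {len(vertices)} vertices"

class FakeOpenGL(GraphicsAPI):
    def draw_triangle(self, vertices):
        return f"GL drew a triangle with {len(vertices)} vertices"

def render(api):
    # The game only knows about the API, never the hardware behind it.
    return api.draw_triangle([(0, 0), (1, 0), (0, 1)])

print(render(FakeDirect3D()))
print(render(FakeOpenGL()))
```

The point is that the same `render` code runs unchanged no matter which implementation sits behind the interface; the driver plays the role of the concrete class.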

Windows Vista and 7 currently support DirectX 10.1 as the latest version, and DirectX 11 is coming to both very soon, bringing a few exciting new features. We'll get to those in a minute.

Laptop buying advice

See all laptop reviews

NEXT PAGE: What happens when you're not on Windows

  1. The issues you need to consider when purchasing
  2. What happens when you're not on Windows
  3. CUDA and ATI Stream
  4. SLI and Crossfire

If you're confused by the wealth of choice when it comes to graphics cards, then fear not. We've looked at the common issues, and our guide should help you decide which card is right for you.


OpenGL

If you're not on Windows, odds are that programmers are accessing 3D hardware through an API called OpenGL.

This standard graphics API is controlled by a collaborative entity called the Khronos Group, which has members from lots of big software and hardware makers. OpenGL is available and used on Windows (in fact, the newest version of Photoshop uses it for GPU acceleration), but it isn't as common as Direct3D.

These days, all modern GPUs (discrete and integrated) provide both OpenGL and DirectX drivers.


OpenCL

Remember when I mentioned that GPUs can be used for general computing (video format conversions, heavy scientific calculations, and so on)? OpenCL is a standardised way of doing this.

An OpenCL program can run on and be accelerated by the GPU, regardless of who the GPU manufacturer is. It's a brand new standard, appearing in both Apple's new Snow Leopard OS and Windows (XP, Vista, and 7).

Neither nVidia nor ATI has real, final, public OpenCL drivers yet. The technology is in its infancy but should grow rapidly; robust OpenCL support and good performance will probably be a real selling point in the next year or two.
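The core OpenCL idea is that you write a small 'kernel' function, and the runtime executes it once per work item across the GPU's many cores. The sketch below mimics that shape in plain Python; real OpenCL kernels are written in a C dialect and launched through the OpenCL runtime, so this is only an illustration of the execution model.

```python
# Plain-Python sketch of the OpenCL execution model: a 'kernel'
# runs once per work item, each identified by a global id (gid).

def vector_add_kernel(gid, a, b, out):
    # Each work item handles exactly one array element.
    out[gid] = a[gid] + b[gid]

def launch(kernel, global_size, *args):
    # On a GPU these iterations would run in parallel across
    # hundreds of cores; here we simply loop sequentially.
    for gid in range(global_size):
        kernel(gid, *args)

a = [1, 2, 3, 4]
b = [10, 20, 30, 40]
out = [0] * 4
launch(vector_add_kernel, 4, a, b, out)
print(out)  # [11, 22, 33, 44]
```

Because each work item touches only its own element, the work can be spread across as many cores as the hardware has, which is why 'embarrassingly parallel' jobs like this suit GPUs so well.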

Drivers, drivers, drivers

No matter what graphics processor you have, you need the latest drivers. For desktop cards, download them directly from nVidia's, ATI's, or Intel's website. If you have a laptop, you may need to go to your laptop manufacturer's website instead.

DirectX 11

Microsoft's marketing department is doing its best to brand DirectX 11 as a Windows 7 thing, but the truth is that it's coming to Vista as well. This new version of the API brings with it several new features. It's too much to go into here, but the short list is:

  • Better use of multi-core CPUs
  • Tessellation - the fancy word for breaking an object made of a small number of triangles (and thus blocky-looking) into a very large number of triangles, which can then be manipulated to make the object look smoother or more detailed.
  • DirectCompute (aka 'Compute Shaders') - Like OpenCL, this is a standardised way to make any GPU with DirectX 11 drivers do general computation.
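The tessellation idea is easy to see with a little arithmetic: each subdivision pass splits every triangle into four smaller ones by joining its edge midpoints, so detail grows geometrically. The sketch below is illustrative only; DirectX 11 does this in dedicated hardware stages, not in application code like this.

```python
# Illustrative sketch of tessellation by midpoint subdivision:
# each pass turns every triangle into four smaller triangles.

def midpoint(p, q):
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

def subdivide(triangles):
    out = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        # One corner triangle per original vertex, plus the centre one.
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out

mesh = [((0, 0), (1, 0), (0, 1))]  # one coarse, blocky triangle
for _ in range(3):
    mesh = subdivide(mesh)
print(len(mesh))  # 64 triangles after three passes (4 ** 3)
```

This is why tessellation is attractive: the game ships a coarse, cheap mesh, and the GPU manufactures the extra triangles on the fly where detail is needed.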


CUDA and ATI Stream

For the past several years, both nVidia and ATI have been working on using the GPU for general computing tasks, but it's hard to launch a new software industry when each company has its own proprietary means of programming its graphics products.

nVidia's is called CUDA; ATI's is called ATI Stream. CUDA is the more popular of the two, but it's still mostly confined to 'big iron' high-performance computing and academic fields, with only a handful of real consumer apps.

New programming models, such as using the GPU for general computing tasks, tend to take off when standards emerge, so the real action will probably be in OpenCL and DirectX 11 Compute Shaders. Don't let CUDA or ATI Stream influence your buying decisions too much.

Future hardware: nVidia, ATI and Intel's Larrabee

Both ATI and nVidia are getting their new DirectX 11 class graphics products ready to roll, and ATI appears to be a few months ahead in its rollout. If the rumours are to be believed, the company should have a top-to-bottom lineup in the next month or two. nVidia may only have high-end chips at first, and then only at the end of the year or possibly early next year.

Unfortunately, we can't tell you which one is the better buy because we don't really know about their price, performance, power utilisation, or any of that other stuff. But if you don't desperately need a new graphics card right now, you might want to wait a few months and see how this new generation of products looks.

Meanwhile, Intel is preparing a novel product codenamed Larrabee: a GPU that will first appear in a high-end discrete graphics card rather than the typical integrated graphics we see from Intel.

Rather than following traditional graphics chip architecture, it's a chip full of many very compact x86 CPUs (similar to the Atom chip for netbooks), each with very wide vector processing units and a specialised set of programming instructions.

This makes the chip very flexible, and it should be great for GPU compute type applications, but will it be a fast graphics chip? Nobody knows. What we do know is that Intel is a year ahead of everyone else on chip manufacturing technology and should never be underestimated.


SLI and Crossfire

These are terms for nVidia (SLI) and ATI (Crossfire) technologies to use more than one GPU at a time for higher performance. Should you get it? Generally speaking, this is one of those 'if you have to ask, the answer is no' sorts of technologies.

You can expect a second GPU to add maybe 50 to 80 percent performance over the first, and from there the gains diminish rapidly. A third GPU gets you maybe 30 percent more, and a fourth (yes, you can build a four-GPU system!) barely improves on the third at all.
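Those diminishing returns are easy to see if you tally them up. The figures below are the article's rough ballparks (taking the middle of the 50-80 percent range for the second card), not benchmark data:

```python
# Rough arithmetic for multi-GPU scaling using the ballpark figures
# above: each entry is the extra performance a GPU adds, expressed
# as a fraction of a single card.

gains = [1.0, 0.65, 0.30, 0.05]

performance = 0.0
for n, gain in enumerate(gains, start=1):
    performance += gain
    print(f"{n} GPU(s): ~{performance:.2f}x a single card")
```

So a four-GPU rig lands at roughly double a single card's performance for quadruple the hardware cost, which is why the 'if you have to ask' rule applies.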

Enthusiast gamers with very big, high-resolution monitors are the target market for multi-GPU solutions. If this is you, you might want to consider SLI or Crossfire.

You'll need a motherboard with two graphics slots that supports SLI/Crossfire, but these are not uncommon. Odds are that most of you reading this article aren't the target market for this.

Discrete or integrated?

Okay, it's time to make a buying decision. Do you go with a discrete graphics card (in either a desktop or laptop) or integrated graphics? If you want to play games, even just a bit, you'll have a far better experience with discrete graphics.

If all you want to do is browse the web and do some light word processing or email, integrated is probably enough. Intel's integrated graphics isn't as good as nVidia's or ATI's, and if you care about the quality of the video (watching DVDs or downloaded video on your PC), you want an nVidia or ATI graphics chip.

If battery life is your top concern, avoid discrete graphics and go with integrated.

How much should I spend?

You should probably not spend less than £80 or so on a graphics card. Cards in the £80 to £150 range are good value and can run almost all modern games very well. Once you start spending less than that, the performance drops rapidly and you'll just need to upgrade sooner.

If you or someone who uses the computer is a more serious gamer, look for cards in the £150 to £250 price range. You really don't need to spend more than that if you're reading this article.

How much memory do I need?

You'll see a lot of cheap graphics cards with 1GB of memory on them. This is mostly a waste of money. In the £80 range, there isn't much benefit to having more than 512MB of memory. A faster GPU chip on the card is worth more than a bigger amount of memory.

Once you get to the £150-and-up range, you want a card with 1GB of RAM. If it's integrated graphics, it'll use your main system memory and you don't need to worry about it (this memory sharing is one of the reasons integrated graphics are so slow).

My recommended picks

Low-cost option: Radeon HD 4850

Enthusiasts: Radeon HD 4890

Expensive: GeForce GTX 285 or Radeon HD 4870 X2

See also: Group test: top 12 ATI & nVidia graphics cards
