Our Ivy Bridge graphics tests show that with the introduction of Intel's Ivy Bridge processors (review), new PCs really don't need budget graphics cards anymore.

The most impressive feature of Intel's new Ivy Bridge CPU is the graphics portion of the chip. The HD 4000 GPU built into these processors is a huge improvement over the HD 3000 found in current-generation Sandy Bridge products, and it's fast enough to make inexpensive entry-level graphics cards obsolete. You'll still want a good discrete graphics card for serious gaming, but our benchmarks show that there's just no reason to buy a £50 graphics card anymore.

This stands in contrast to the processor's general compute performance, which is just slightly faster than current Intel CPUs. (For more, take a look at our testing of overall Ivy Bridge performance.) Ivy Bridge is more energy-efficient, which will be especially useful in laptops, but the most noticeable change in performance will be felt when you run 3D graphics applications.

Ivy Bridge graphics: Features, Speeds, and Feeds

At first glance, the Ivy Bridge GPU might seem as if it would run slower than the Intel HD 3000 technology built into Intel's Sandy Bridge processors. The Sandy Bridge GPU runs at a base clock frequency of 850MHz, with a Turbo Boost clock as high as 1350MHz. Ivy Bridge's HD 4000 GPU, on the other hand, is clocked 200MHz lower, operating at a base clock speed of 650MHz, with a Turbo Boost clock of 1150MHz.

Intel has made enhancements to the GPU engine to improve efficiency, but other factors help to mitigate the clock-rate differential, too. First, the new HD 4000 GPU contains 16 execution units, versus the 12 built into Sandy Bridge. Second, Ivy Bridge supports DDR3-1600 memory, whereas the Sandy Bridge memory controller officially supports only DDR3-1333. The extra execution units give Ivy Bridge 33 percent more parallel compute power, and the faster memory provides higher potential throughput.
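A quick back-of-the-envelope calculation shows how these two factors stack up. The EU counts and memory speeds below come from Intel's published specs; the dual-channel, 64-bit-per-channel bus width is the standard desktop DDR3 configuration and is assumed here for the peak-bandwidth figures.

```python
def ddr3_bandwidth_gbps(megatransfers, channels=2, bus_bytes=8):
    """Peak theoretical bandwidth in GB/s for a dual-channel DDR3 setup."""
    return megatransfers * 1e6 * channels * bus_bytes / 1e9

# Execution units: Sandy Bridge HD 3000 vs. Ivy Bridge HD 4000
sandy_eus, ivy_eus = 12, 16
eu_gain = (ivy_eus - sandy_eus) / sandy_eus   # 4/12, i.e. 33 percent more

# Officially supported memory: DDR3-1333 vs. DDR3-1600
sandy_bw = ddr3_bandwidth_gbps(1333)          # about 21.3 GB/s
ivy_bw = ddr3_bandwidth_gbps(1600)            # 25.6 GB/s

print(f"EU gain: {eu_gain:.0%}")
print(f"Peak bandwidth: {sandy_bw:.1f} GB/s -> {ivy_bw:.1f} GB/s")
```

These are ceilings, not measurements: real-world throughput depends on the workload, and the Turbo Boost clocks cut the other way, which is why the benchmarks below matter.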

Before diving into performance comparisons, it's worthwhile to examine additional Ivy Bridge GPU feature changes.

  • Full support for Microsoft's DirectX 11 API, including hardware tessellation and GPU compute: Intel claims that GPU-compute applications will run exclusively on the Ivy Bridge GPU if so instructed, and GPU-compute tasks won't be offloaded to the CPU.
  • Two texture units are present, as opposed to a single texture unit in Sandy Bridge.
  • The new compute shader enables greater data parallelism, and full support for Shader Model 5, which is required for DirectX 11.
  • A shared L3 cache is built into the GPU core itself, which reduces the need to fetch data from the ring bus and the CPU cache.
  • Support is included for up to three simultaneous displays (DVI, HDMI, or DisplayPort).
  • HDMI 1.4a, including high-bit-rate audio, is supported.
  • Quick Sync video is improved, and includes better support for Blu-ray stereoscopic 3D.

Those feature additions, along with the internal rearchitecting of the actual compute units, suggest that Ivy Bridge 3D graphics performance should be noticeably better than what Sandy Bridge delivers. Let's take a look at actual benchmarks.

Ivy Bridge graphics: Performance Testing

We ran benchmarks in several configurations, but performed all of them on a common platform:

  • Gigabyte Z77-UD3 motherboard
  • Frame buffer memory for Intel HD Graphics set to 1GB in the system BIOS
  • 8GB DDR3-1600 memory for the Core i7-3770K; DDR3-1333 for the Core i7-2600K
  • 1TB, 7200-rpm Western Digital hard drive
  • 750W Antec High Current Pro power supply

We also followed certain procedures:

  • We ran all game benchmarks at 1080p resolution.
  • We set Unigine Heaven to 1080p, with normal tessellation enabled.
  • We ran 3DMark Vantage and 3DMark 2011 in their “performance” modes.
  • We ran game benchmarks on Ivy Bridge in DirectX 11 mode when available, and also ran the same games in DX9/10 modes for comparison.
  • We paired an XFX Radeon HD 6450 (an entry-level, DirectX 11-capable graphics card costing roughly $40) with the Ivy Bridge system, to get a feel for how the Intel HD 4000 compares to an entry-level discrete graphics card.

Ivy Bridge graphics: Synthetic Benchmarks

First up is 3DMark 2011, a DirectX 11-only graphics benchmark.

3DMark 2011 DX11

The Radeon HD 6450 fell far behind the Intel HD 4000. It was no contest, really.

Now it's on to another DirectX 11 synthetic test, Unigine Heaven 2.5.

Unigine Heaven DX11

Here, the Intel GPU actually lapped the Radeon HD 6450, achieving over double the frame rate at 1080p, with normal tessellation set. It's not a very high number, to be sure, but the results from both 3DMark 2011 and Unigine Heaven are solid proof that the Intel HD 4000 is DX11-capable.

Next, let's look at 3DMark Vantage, a DirectX 10 synthetic test.

Ivy Bridge test 3DMark Vantage

The Core i7-3770K has a 100MHz CPU-clock advantage over the Core i7-2600K, but the GPU clocks are unchanged from the stock configuration: the 2600K's HD 3000 still runs at 850MHz, and the 3770K's HD 4000 still runs at 650MHz. Even so, the Ivy Bridge GPU posted a score almost 900 points higher. Meanwhile, the Radeon HD 6450 kept chugging along in a distant third.

Synthetic benchmarks are fine, but how did Intel's graphics technology fare on the real games in our tests? Read on.

NEXT PAGE: Ivy Bridge graphics - game performance >>