Tuesday, May 17, 2011

Sandy Bridge Graphics Disappoints


See update at the end of this post: new drivers released.
Well, I'm bummed out.

I was really looking forward to purchasing a new laptop that had one of Intel's new Sandy Bridge chips. That's the chip with integrated graphics which, while it wouldn't exactly rock, would at least be adequate for games at midrange settings. No more fussing around comparing added discrete graphics chips, fewer scorch marks on my lap, and other associated goodness would ensue.

The pre-ship performance estimates and hands-on trials said that would be possible, as I pointed out in Intel Graphics in Sandy Bridge: Good Enough. This would have had the side effect of pulling the rug out from under Nvidia's GPU volumes, forcing the HPC market to pull its own weight, meaning carry traditional HPC price tags (see Nvidia-based Cheap Supercomputing Coming to an End). That would have been an earthquake, since most of the highest-end HPC systems now get their peak speeds from Nvidia CUDA accelerators, a situation in no small part due to their (relatively) low prices, which follow from high graphics volumes.

Then TechSpot had to go and do a performance comparison of low-end graphics cards, and later, just as a side addition, throw in measurements of Sandy Bridge graphics, too.

Now, I'm sufficiently old-fashioned in my language that I really try to avoid even marginally obscene terms, even if they are in widespread everyday use, but in this case I have to make an exception:

Damn, Sandy Bridge really sucks at graphics.

It's the lowest of the low in every case. It's unusable for every game tested (and they tested quite a few), unless you're on some time-dilation drug that makes less than 15 frames per second seem zippy. Some frame rates – at medium settings – are in single digits.

With Sandy Bridge, Intel has solidly maintained its historic lock on the worst graphics performance in the industry. This, by the way, is with the Intel i7 chips overclocked to 3.4 GHz. That should also overclock the graphics (unless Intel is doing something I don't know about with the graphics clock).

Ah, but possibly there is a "3D" fix for this coming soon? Ivy Bridge, the upcoming 22nm shrink of Sandy Bridge (the Intel "tick" following the Sandy Bridge "tock"), has those wondrous new much-promoted transistors. Heh. Intel says Ivy Bridge will have – drum roll – 30% faster graphics than Sandy Bridge.

See prior marginal obscenity.

Intel does tend to sandbag future performance estimates, but not by enough to lift 30% up to 200-300%; that's what would be needed to produce what people were saying Sandy Bridge would do. Is that all we get from those "3D" transistors? The way the Intel media guys are going on about 3D, I expected Tri-Gate (which can be two- or five- or whatever-gate) to give me an Avatar-like mind meld or something.
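
To make that 200-300% figure concrete, here's a quick back-of-the-envelope sketch in Python. The numbers are illustrative assumptions of mine, not benchmark results: 10 fps stands in for the single-digit medium-settings rates TechSpot measured, and 30 fps for a comfortably playable floor.

sandy_bridge_fps = 10.0   # assumed medium-settings frame rate (illustrative)
playable_fps = 30.0       # assumed "comfortably playable" threshold (illustrative)

ivy_bridge_fps = sandy_bridge_fps * 1.30                     # Intel's promised +30%
needed_improvement = (playable_fps / sandy_bridge_fps - 1.0) * 100

print(f"Ivy Bridge estimate: {ivy_bridge_fps:.0f} fps")           # ~13 fps
print(f"Improvement actually needed: {needed_improvement:.0f}%")  # 200%

Tweak those assumptions however you like; the gap stays a multiple, not a percentage bump.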

All that stuff about on-chip integrated graphics taking over the low-end, high-volume market for discrete graphics just isn't going to happen this year with Sandy Bridge, or later with Ivy Bridge. Rubbing further salt in the wound, Nvidia is even seeing a nice revenue uptick from selling discrete graphics add-ons to new Sandy Bridge systems. It's not that I have anything against Nvidia. I just didn't think that uptick, of all things, was going to happen.

This doesn't change my opinion that GPUs integrated on-chip will ultimately take over the low-end graphics market. As the real Moore's Law – the law about transistor densities, not clock rates – continues to march on, it's inevitable that on-chip integrated graphics will be just fine for low- and medium-range games. It just won't happen soon with Intel products.

Ah, but what about AMD? Their Fusion chips with integrated graphics, which they call APUs, are supposed to be rather good. Performance information leaked on message boards about their upcoming A4-3400, A6-3650 and A8-3850 APUs makes them sound as good as, well, um, as good as Sandy Bridge was supposed to be. Hm.

Several years ago I heard a high-level AMD designer say that people looking for performance with Fusion were going to be disappointed; it was strictly a cost/performance product. Things could have changed since then, but chip design lead times are still multi-year.

In any event, this time I think I'll wait until shipped products are tested before declaring victory.

Meanwhile, here I go again, flipping back and forth between laptop specs and GPU specs, as usual.

Sigh.


UPDATE May 23, 2011

Intel has just released new drivers for Sandy Bridge. The press release says they provide “up to 40% performance improvements on select games, support for the latest games like Valve’s Portal 2 and Stereoscopic 3D playback on DisplayPort monitors.”

At this time I don't know of any test results that would confirm whether this really makes a difference. But if it's real, and applies broadly enough, a 40% driver gain stacked on Ivy Bridge's promised 30% would compound to roughly an 80% improvement over today's numbers, and that might be just barely enough to make Ivy Bridge the beginning of the end for low-end discrete graphics.