31 March 2008

Lifetime Electricity Costs for High-end Video Cards

So a few weeks ago I discussed how we're going to need a greater emphasis on low-power electronic design in the future. I thought it would be helpful to actually put down some numbers and see how large a cost power consumption represents for high-end computer equipment. For some components, such as the hard drives used in server farms, power draw is already an important metric. For others, the manufacturers often don't even publish consumption numbers.

High-end graphics cards are becoming particularly power hungry, as this chart by anandtech.com shows. The two best-performing video cards at a reasonable price point right now are the 8800GT by NVidia and the Radeon 3870 by ATI. Even idling, these systems chew through an impressive amount of power: 165 W in the case of the NVidia product and 125 W for the ATI one. I don't know whether anandtech.com's methodology is normalized for the efficiency of the power supply, but regardless, they are burning a lot of power while doing next to nothing. Average residential electricity prices in the US were up to 10.4 ¢/kW·h as of 2006.

Let's assume that a card's lifetime consists of being always on at idle for two years, or about 17,500 hours. "Idle" in this case basically extends to any 2-D application, such as browsing the internet or using a word processor. Only 3-D accelerated games are going to stress these systems to any significant degree.

Video Card                         NVidia 8800GT    ATI Radeon 3870
Initial Purchase Price             $260             $255
Idle Power Consumption             165 W            125 W
Expected Lifetime                  17,500 hours     17,500 hours
Lifetime Est. Power Consumption    2887.5 kW·h      2187.5 kW·h
Lifetime Electricity Cost          $300.30          $227.50
Total Cost                         $560.30          $482.50


As we can see, the operational cost of these two cards is roughly comparable to the purchase price. Even with the modern price of gasoline, automobiles don't have such a high proportion of their lifetime cost associated with fuel.
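
For anyone who wants to check the numbers, here is the arithmetic behind the table as a minimal Python sketch, using only the figures above (purchase prices, idle wattages, the 17,500-hour lifetime, and the 10.4 ¢/kW·h rate):

    HOURS = 17500                 # roughly two years of always-on idle
    PRICE_PER_KWH = 0.104         # 2006 US average residential rate, $/kW·h

    def lifetime_cost(purchase_price, idle_watts):
        kwh = idle_watts * HOURS / 1000.0        # W·h -> kW·h
        return purchase_price + kwh * PRICE_PER_KWH

    print(round(lifetime_cost(260, 165), 2))     # NVidia 8800GT:   $560.30
    print(round(lifetime_cost(255, 125), 2))     # ATI Radeon 3870: $482.50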

Both of these cards come with 512 MB of fast memory spread out over eight chips. However, a single buffer at 1280x1024 pixels with 32-bit color requires less than 6 MB of RAM, so there's no need to keep all that memory powered on for the vast majority of computer applications. One chip should suffice for a triple-buffered display.
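
As a quick sanity check on that memory math, here is a sketch assuming 1280x1024 at 32-bit color and the eight-chip, 512 MB layout of the cards above:

    width, height, bytes_per_pixel = 1280, 1024, 4   # 32-bit color
    one_buffer = width * height * bytes_per_pixel    # 5,242,880 bytes
    print(one_buffer / 2**20)                        # 5.0 MiB per buffer
    print(3 * one_buffer / 2**20)                    # 15.0 MiB triple buffered
    print(512 // 8)                                  # 64 MB per chip, plenty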

Similarly, it should be technically possible to dynamically clock down the processor and bus speeds to reduce the power consumption of the GPU. Alternatively, one could embed a slow secondary GPU for 2-D applications. Most motherboards are available with on-board video in the Northbridge chipset, which is just fine for web browsing (and useful if you ever want to flash the BIOS on your video card, BTW). The marginal cost of on-board graphics to the manufacturer is probably around $5.

I find it somewhat surprising that neither of the major graphics manufacturers has tried to radically improve the power performance of their cards. There is, potentially, a major competitive advantage to be had. For example, if ATI were to spend $5 per card to drop the idle power requirement to 1/8th that of the NVidia model, and advertise that fact and the estimated savings aggressively, they could recapture a lot of the market share they've ceded since the heyday of the Radeon 9700 Pro.
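
As a rough sketch of what that gamble would be worth to a buyer, under the premises above (the 1/8th idle target is my hypothetical, the rest comes from the table):

    nvidia_idle, ati_idle = 165, 125            # idle watts, from the table
    target = nvidia_idle / 8                    # ~20.6 W hypothetical idle
    saved_kwh = (ati_idle - target) * 17500 / 1000
    print(round(saved_kwh * 0.104, 2))          # 189.96, ~$190 saved per card

Roughly $190 in electricity savings bought with a $5 part would be an easy story to advertise.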

2 comments:

Anonymous said...

AMD already uses the low-power secondary video hardware idea in their Puma laptop platform.

See, for example:
http://www.reghardware.co.uk/2008/05/23/amd_puma_june/
(Late in the article they talk about letting the main GPU power down when running on batteries.)

Interesting thought about powering down all but one RAM chip, but that would make the transition slow, because you'd need to copy a bunch of data around to get everything you need onto one DRAM chip. Also, you'd probably need two memory controllers, one low-power and one normal. It's not just the framebuffer that's used in 2D mode, either: off-screen images are cached in video RAM. Font glyphs might be, too, depending on the driver/toolkit software architecture.

But I totally agree with your overall point that video cards are hogs, and I would really like to be able to have a fast video card in my on-all-the-time computer without wasting 20 W all the time.

Anonymous said...

Those figures look wrong. They look like total system numbers. The 9800GTX+ uses 60 W idle (at the plug), and I'm not aware of it introducing any new power-saving mechanisms since the 8800GTX.

All video cards have had a 2D mode for years now. They reduce the clock frequency of the core and the memory chips. Some of the newer cards switch off the die area dealing with shaders in 2D mode.