Why can't new video cards use less power when idle or in 2D?

Why can't they make a high-end video card that, when idling or doing 2D, uses few resources / LOW wattage?

I don't get it, a single ATI 4870 idling is like 138 watts.

My box with an overclocked E8400, 3 SATA/IDE hard drives, and an older GeForce 7600 GT idles at 125 watts. (How much of that the video card represents I have no idea; I assume it can't be more than 40-50 watts.)

It just seems so wasteful, if I only game on my system 10% of the time, to burn so much electricity. My system is on 24x7.

Some might say to have another system for gaming, I don't have that option. And it is ignoring the problem.

I cannot tell the difference in 2D performance between the ATI 4870 and the 7600 GT.

Motherboard designers, CPU makers, etc. have become power conscious; I think it is time for video card makers to do the same.

Even if the card cost an additional 20 dollars because they had to add a low-power graphics processor in addition to the main processor, it would be worth it in electricity savings for the user. The low-power processor would be in use until D3D or whatever triggers the "gaming mode". It could even be that no extra cooling needs to turn on until the card goes into "gaming mode".

Does anyone see this changing anytime soon?

I personally won't be buying any high end video cards until that changes.
 
Happy New Year Monday! OK, the 138 watts you're getting must be total power consumption. As far as high-end video cards go, I think idle power consumption on the GTX 280 and 4870 X2 has improved considerably if you look at the chart below. It's actually quite amazing that a 4870 X2 at idle does better than a single 8800 GTX, and it's a mammoth improvement over its previous-gen brother, the 3870 X2. This shows some remarkable improvement on ATI's end... and on Nvidia's end, the GTX 260 and GTX 280 only consume like 140 watts or so at idle, which is a lot better than the previous-gen 8800 Ultra/GTX.

[chart.jpg: idle/load power consumption comparison chart]
 
Er, it did worse than its earlier generation. You must have been looking at the CrossFire comparison, which was 2 x 3870 X2s (so 4 GPUs)

In terms of power-usage that is...
 
I don't get it, a single ATI 4870 idling is like 138 watts.

My box with an E8400 over clocked with 3 SATA IDE hard drives and an older GeForce 7600 GT idles at 125 watts. (What out of that the video card represents I have no idea, I assume it can't be more the 40-50 watts)

That 138 watts is total system draw, *not* for the card. I believe the 4870 at idle uses somewhere around 50w or less.
 
Yeah, checking, the 3870 X2 used less power than the 4870 X2.

But I do agree they should really be able to throttle back. I mean, how many people have undervolted their cards in 2D mode by dropping clocks even lower than the factory 2D settings?
 
The OP is complaining that while his system with a 7600GT idles at 125W, his system with a 4870 idles at 138W. Apparently drawing 13W more at idle is objectionable, even though the 4870 is, what, 6 times faster?

Does the OP know about S3 Suspend mode?
 
Today's cards do scale back very well, they just don't do it automatically through the drivers, which is a shame. If it concerns you that much, grab a 3rd-party app like ATI Tray Tools for ATI cards or RivaTuner for NVIDIA cards. I don't have a Kill A Watt meter to tell you the power draw difference, but for example with my 4870 the VDDC current read by GPU-Z is 25A at idle on stock configs, but after modding reads only 6.8A. Therefore the GPU and its circuitry are using ~1/4 the power of stock. I can't tell you about the rest of the card as there are no sensor outputs, but the card is using a lot less power.
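As a rough sanity check on that ~1/4-of-stock figure: VDDC rail power is roughly voltage times current, so the ratio can be sketched as below. The currents come from the post; the voltages are assumed values for illustration only.

```python
def vddc_power(volts, amps):
    """Approximate VDDC rail power in watts: P = V * I."""
    return volts * amps

# Currents reported by GPU-Z in the post; voltages are assumptions.
stock = vddc_power(1.263, 25.0)   # stock config, 25A reported
modded = vddc_power(1.083, 6.8)   # after modding, 6.8A reported
print(round(modded / stock, 2))   # prints 0.23, i.e. roughly 1/4 of stock
```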
 
yah, whoops, didn't realize that was 3870x2s in CF. Well it's still a feat for one 4870x2
to use less power at idle than a single 8800gtx.
 
Yeah, checking, the 3870 X2 used less power than the 4870 X2.

But I do agree they should really be able to throttle back. I mean, how many people have undervolted their cards in 2D mode by dropping clocks even lower than the factory 2D settings?

I would underclock my 260 in 2D mode, but because it's permanently stuck in Performance 3D I can only go down to 290/600 and have to mess about with Rivatuner profiles :mad:
Annoys the hell outta me as I had my 4870 running 160/400 at the desktop with no intervention required for it to ramp up to 860/1000 for gaming. The little screen flicker didn't bother me at all, and I wondered why people were setting identical 2D/3D clocks in the BIOS in order to avoid it.
 
I have been wondering this forever.

I have used three different video cards on this system, and the most powerful, a Radeon 4850, draws more than 40 watts more than the cheap fill-in card I used. I had a 7900GT before this, and I saw the power consumption of the system jump 20 watts when I upgraded to the 4850.

There are technologies like hybrid power but I have no idea what combination of cards/motherboards you need to use to enable it because it sure as hell did not gain much traction and is hardly reported on. I thought it was cool since I am conscious of energy waste but I guess no one else thought it was cool.
 
The 4850 is a lot more powerful than a 7900GT, however, and rivals the 8800GTX. You have to use more of a performance-to-power ratio comparison to get a better idea of how power usage has improved. If you take a newer-gen card that gives the same performance as a 7900GT, it's going to be a lot better in both idle and load power consumption.

And as far as performance per watt goes, the 4850 truly dominates this arena.
 
I was not aware that the numbers being posted on sites I had been reading were FULL SYSTEM idle watts or FULL SYSTEM load. (I can see the chart above in this post clearly lists that)

One of the articles I was reading was from Ars Technica; it does not say anywhere in the article that the power consumption listed for each video card they compare is the full system draw.

If an article states power consumption for a given video card and then gives a chart with numbers for each card, I figured those were the actual power consumption figures for the listed cards, not the total power used by the system.

So it sounds like I could still get a higher-end graphics card whose idle usage would be closer to my current card's than I was thinking.

I still don't get why there is not more focus on energy savings when it comes to video cards. For a small premium, I would take onboard-video-level 2D performance and energy usage, coupled with higher energy usage only when gaming. It would pay for itself.

Thanks
 
If you have a 4870, then just download ATT and make a profile for 550/200 @ 1.083v for idling, or just flash the bios. Here's what mine runs at idle.

http://img363.imageshack.us/img363/3231/idle4870po0.jpg

Not sure why ATi doesn't downclock the memory or voltage.

SpeedStep-like functionality for the video card. I would think that should be a built-in option.

Thanks, will check that out. I was planning on picking up the 4870, and at first thought my idle watt usage was going to go from 125 to like 263 because the article I read did not indicate that the power usage was full-system power.

Doing the math, that was not just an extra 20 bucks a year where I am.

I will just get a 4870 locally first to see what the difference in idle usage truly is.
 
I think they have done a good job, to be honest, to get it to around 150W idle, as these things used to suck a lot more power 5 years ago.
 
When I was looking around I found this (I have no idea how accurate it is; they actually try to estimate the actual draw of each card):

------------------------------------------------------------------------------------------------------------------------------

Let's look at power, a fairly imprecise and variable thing on the best of days. All measurements were taken with an Extech True RMS Power Analyzer, and were measured at the wall. Keep in mind that this means the numbers are bumped up by the inefficiency of the PSU, so if you want the true DC power consumed, knock 20-25 per cent off the top of these figures.

If you subtract out the power used by the single cards from the same cards in Crossfire, you end up with roughly what a single GPU uses. More or less. Sort of. It looks like this, in Watts of course.

Idle usage for card alone:

4870 - 93 watts single
4850 - 46 watts single
3870 - 33 watts single
2900xt - 72 watts single

-------------------------------------------------------------------------------------------------------------------------------
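The quoted methodology (wall readings, minus 20-25% for PSU inefficiency, with the single-GPU figure backed out by subtracting the one-card system draw from the CrossFire system draw) can be sketched like this; the wall readings below are hypothetical numbers for illustration, not the article's raw data:

```python
def single_gpu_watts(one_card_system, crossfire_system):
    """Rough single-GPU draw: the extra wall power the second card adds."""
    return crossfire_system - one_card_system

def dc_watts(wall_watts, psu_loss=0.225):
    """Strip out PSU inefficiency (20-25%; 22.5% used here) to
    approximate the true DC power consumed."""
    return wall_watts * (1.0 - psu_loss)

# Hypothetical wall readings, single card vs. CrossFire:
at_wall = single_gpu_watts(210.0, 303.0)      # 93W at the wall
print(at_wall, round(dc_watts(at_wall), 1))   # prints: 93.0 72.1
```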

It's my opinion that ALL of the cards should have the same idle draw: at most 33 watts.

If those numbers are correct, the cost difference between the best and the worst, idling for a year where I am, is between 50-60 dollars, with electricity costs alternating between $0.09 and $0.12 per kWh.

I am guessing a good designer could make the difference between idle/2D and actual load on the GPUs even greater.
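The 50-60 dollar estimate checks out with a quick back-of-the-envelope calculation, assuming the card idles 24/7 and using the 33W vs. 93W figures quoted above:

```python
def annual_idle_cost(watts, dollars_per_kwh, hours_per_year=24 * 365):
    """Yearly electricity cost of a constant draw, in dollars."""
    return watts / 1000.0 * hours_per_year * dollars_per_kwh

diff_watts = 93 - 33   # worst vs. best idle draw from the list above
low = annual_idle_cost(diff_watts, 0.09)
high = annual_idle_cost(diff_watts, 0.12)
print(round(low, 2), round(high, 2))   # prints: 47.3 63.07
```

So the 60W gap works out to roughly $47-63 a year at those rates, in line with the 50-60 dollar figure.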
 
SpeedStep-like functionality for the video card. I would think that should be a built-in option.

Thanks, will check that out. I was planning on picking up the 4870, and at first thought my idle watt usage was going to go from 125 to like 263 because the article I read did not indicate that the power usage was full-system power.

Doing the math, that was not just an extra 20 bucks a year where I am.

I will just get a 4870 locally first to see what the difference in idle usage truly is.
ATi cards do have PowerPlay, but it's not optimized at all.

According to the page where I got most of my info, they saved around 45W at idle with 160/225 @ 1.083v. Actually there's only a 1W difference between 550 and 160 on the core, which is why I use 550/200.

Here's the link to that info.
http://forums.techpowerup.com/showthread.php?t=67928
 
SpeedStep-like functionality for the video card. I would think that should be a built-in option.

Thanks, will check that out. I was planning on picking up the 4870, and at first thought my idle watt usage was going to go from 125 to like 263 because the article I read did not indicate that the power usage was full-system power.

Doing the math, that was not just an extra 20 bucks a year where I am.

I will just get a 4870 locally first to see what the difference in idle usage truly is.

My 260 does this.

Don't know why others don't, but just sitting on the desktop its clock is backed down to 300MHz on the core and 100MHz on the RAM (full clocks are like 1063MHz for the RAM and 630MHz for the core). It uses almost nothing at idle, and if you have an NV mobo with onboard video you can run Hybrid SLI and totally turn off the big card and use the low-power onboard GPU.
 
I still don't get why there is not more focus on energy savings when it comes to video cards. For a small premium, I would take onboard-video-level 2D performance and energy usage, coupled with higher energy usage only when gaming. It would pay for itself.

It's called HybridPower. It disables the video card and switches to the IGP when idling. There isn't a push for lower idle power usage because in general people don't care. ATI and Nvidia aren't going to waste money on something that users don't really care all that much about. I certainly didn't lose any sleep over the extra 10-20W my 4850 uses at idle vs. my old 7950GT. My solution is to simply put my rig into S3 sleep when I'm not using it.

There are technologies like hybrid power but I have no idea what combination of cards/motherboards you need to use to enable it because it sure as hell did not gain much traction and is hardly reported on. I thought it was cool since I am conscious of energy waste but I guess no one else thought it was cool.

The problem with Hybridpower and similar is that you need a motherboard with an IGP. What enthusiast boards actually have an IGP? The people who have the video cards that guzzle power while idling are the same ones who overclock or buy higher end motherboards that don't have IGPs.
 
Why can't they make a high-end video card that, when idling or doing 2D, uses few resources / LOW wattage?


It just seems so wasteful, if I only game on my system 10% of the time, to burn so much electricity. My system is on 24x7.

You seem to be contradicting yourself. If you're so worried about saving power, why are you leaving your system on 24/7?
 
You seem to be contradicting yourself. If you're so worried about saving power, why are you leaving your system on 24/7?

I game 10% of the time, maybe 20%; it really depends how much extra free time I have in any given week. The other 80-90% of the time I am doing other stuff on my PC, so it cannot be in S3 sleep mode. How is it a contradiction to want my PC to use less power the other 80-90% of the time?

I will need to get a 4870 myself to see what the difference is.

Eventually it will be in the video card makers' interest to conserve power; most other components are moving that way already. (And it already sounds like, to some degree, there is a shift starting in video cards.)
 
[17349.png: idle power consumption chart]


A 60+ watt difference at idle between the highest and the lowest seems like a lot, given they are all doing the same thing at that point: 2D.
 
Complain to ATI to get their act together. If you want a card that consumes less power then sell your ATI and get a GTX 260 or 280. They will throttle way down at idle.

Right now my core clock is at 301 vs. 666 load.
 
I game 10% of the time, maybe 20%; it really depends how much extra free time I have in any given week. The other 80-90% of the time I am doing other stuff on my PC, so it cannot be in S3 sleep mode. How is it a contradiction to want my PC to use less power the other 80-90% of the time?

I'm saying someone who's worried about saving electricity would shut off their comp while they're sleeping; unless you're a torrent junkie:p
 
My computers run 24/7 folding. The throttling down in between work units is a very nice feature.
Four overclocked 280s all throttle down in between units, and it's a feature I'm glad Nvidia decided to enable.
 
I'm saying someone who's worried about saving electricity would shut off their comp while they're sleeping; unless you're a torrent junkie:p


My PC has to be on all the time; if it didn't have to be fully awake, I would not be concerned about the increase in "idle" usage.

I am fully aware of tools I could harness, like WOL triggered by port knocking before I need remote access, etc. It's not an option for me though.
 
By "idle" you mean "2D mode"?
Also my sliders in ATT only go down to 720 on memory.

Version 1.6.9.1340...
Those "limits" are arbitrarily set in the settings menu of ATI Tray Tools, and you can change them to whatever you want. My 4870 can go down to ~140MHz on the RAM before it starts corrupting the display, so I set it to 200MHz just to be safe.
 
Complain to ATI to get their act together. If you want a card that consumes less power then sell your ATI and get a GTX 260 or 280. They will throttle way down at idle.

Right now my core clock is at 301 vs. 666 load.

Tell that to my 260 - it doesn't do jack shit unless I manually adjust clocks, and I'm not the only person having this issue. At least with the 4870 I just had to set lower 2D clocks in the BIOS (well, and higher 3D clocks too) and hey presto I'm saving energy. I love my 260, but dicking around with Rivatuner whenever I enter/exit a game is a PITA
 
By "idle" you mean "2D mode"?
Also my sliders in ATT only go down to 720 on memory.

Version 1.6.9.1340...
Go to -> Tools and Options -> General Options -> Advanced

Uncheck overclocking limits.

Then make your profiles and put back the check if you want.

Edit

These are the only 2 settings I use.
550/200 @ 1.083v - normal stuff, movies, and light gaming ( Chess, tetris, .... )
820/975 @ 1.263v - commercial games ( COD5, Crysis, Fallout 3, .......)
 
Wow, I didn't realize the GTX 260 went that low at idle. I was under the impression that from the size of these things, they had snuck in a nuclear power plant somewhere in there. Before when I had an x850, I used ATI Tray Tools to auto downclock to the lowest possible. I'm glad nvidia has this 2D/3D mode thing going on, saves me all the hassle.
 
Wow, I didn't realize the GTX 260 went that low at idle. I was under the impression that from the size of these things, they had snuck in a nuclear power plant somewhere in there. Before when I had an x850, I used ATI Tray Tools to auto downclock to the lowest possible. I'm glad nvidia has this 2D/3D mode thing going on, saves me all the hassle.
yep my current system uses less power at idle than any system I have had in the last few years.
 
If you have a 4870, then just download ATT and make a profile for 550/200 @ 1.083v for idling, or just flash the bios. Here's what mine runs at idle.

http://img363.imageshack.us/img363/3231/idle4870po0.jpg

Not sure why ATi doesn't downclock the memory or voltage.


you sure about that?

My Diamond 4870 X2 clocks in at 502/500MHz GPU/RAM when in 2D mode, then kicks it up to 750MHz GPU / 900MHz RAM when in 3D. Once the game is done, back to 502/500.

<-- Using 8.12
 
you sure about that?

My Diamond 4870 X2 clocks in at 502/500MHz GPU/RAM when in 2D mode, then kicks it up to 750MHz GPU / 900MHz RAM when in 3D. Once the game is done, back to 502/500.

<-- Using 8.12
Sure the X2 does that, but not the 4870 single. Even those clocks could be better on the X2 I would assume.

VDDC Current = 4.1A @ 550/200 1.083v
VDDC Current = 10.8A @ 507/500 1.083v
VDDC Current = 23.0A @ 500/900 1.203v ( stock idle clock on my 4870 1GB )

If you look at the charts, the 4870 X2 only uses 15W more than a 4870 512, and that's with an extra 1.5GB and a bridge chip. Sorry, ATi could do much better.

http://www.hardwarecanucks.com/foru...uperclocked-edition-video-card-review-21.html
 