DarkLegacy
[H]ard|Gawd
All those air conditioner manufacturers are loving this. If the power consumption figures for the R600 cards hold true, when a summer blackout occurs, everyone will be like "It's AMD/ATI's fault." =P
Like when, exactly, sir?
A pathetic thing to do to consumers? DO TO?
Nobody has said that, sir.
About always... even just in this post!
Do to, yes, it is. Quite frankly, NO ONE wants to have video cards that take up this much power, but well, here we are and they are forcing a higher power draw standard that no one wants. Would I say it's done personally to anyone, no, but it is something being done to consumers. You can't fault that in any way. People talk about DRM as "doing it to" consumers all the time, how is this in any way different? It's not.
You act like it's just fine without directly saying it, trust me. Patronizing me with comments like this is just plain pathetic, to be quite frank. You ought to give it up, it doesn't lend your opinions any more credibility to anyone, I'm sure.
The R600 has been delayed cause they're trying to make it LONGER and use MORE POWER so the GTX will pwn it in every way. Sheesh
Wait, so what you're saying is, ATi is FORCING people to buy inefficient cards. Right. If people really don't want a card that consumes that much power, they won't buy it, plain and simple.
And you are basing an entire argument on rumored specs, brilliant!
Funny, rumored specs are exactly what everyone is telling people to wait for the R600 for... they're the best info we have right now. As far as the power consumption goes, it seems reasonable, since this particular info came with OEM pictures.
ATI is not giving us another choice if we want a high-end card but to buy one with high power consumption. If their strategy succeeds, nVidia will follow suit, and that'll be all we even have to choose from. This sort of process should be self-explanatory, but I guess some people don't think beyond the here and now.
What I find funny is the rumor mill. Why hasn't ATI released any specs yet? What are they trying to hide? Could the rumored engineering issues be true? I look at it this way: if something walks like a duck, quacks like a duck, and flies like a duck, it must be a duck. The way ATI has this bottled up, and the not-so-glowing rumors we hear going around about the R600 drawing too much power for the performance it delivers because ATI had no choice but to simply toss more power at the core due to engineering screw-ups... I start to wonder if it's true. I mean, the card is being delayed until March, when it was supposed to be out already. Why the delay? Currently Nvidia is spanking you in the open market, even with their crappy drivers... if the card is that stellar, and has no issues, what's the hold up? Why the total lack of info from ATI?
Smells fishy to me. Yah, rumors are rumors, and the benchmarks are BS....but the total lack of info from ATI, and the delays on the card, start to lend weight to the rumors and start to make you go "Hmmm".
I don't understand the logic behind faulting ATi for making a power-hungry graphics card. That's the way they make it; it's not like ATi has a graphics card monopoly and is forcing anyone that wants a high-end card to drain power. Get over it, ATi isn't the root of all evil.
QF(bloodyobvious)T.
edit: And yet, depending on performance and feature set, we don't yet know whether it is "inefficient". What I don't understand is why people like me (3D geeks) would want a new card - any new card - to be slower. FFS, it's all about the 3D, people!!!
It's only nVidia people who are complaining.
Don't think it's actually representative of any larger body of people.
If the reverse were true and nVidia cards used more power, they couldn't care less about power consumption, and I'd put money on that.
In short, they don't actually care that the cards use more power, cause their 8800's use more power than their 7900's. They just care that ATI's cards use more power, that's all.
Now that's a pretty cynical and conspiracy-theorist view. I say you are wrong. I say people have finally woken up and are saying "NO" to energy-inefficient products. Going by your logic, the inefficiency of Ford's trucks is only an imaginary scenario created by all the Toyota fans who want to make Ford look as bad as possible. Now I know you're going to strike back at my analogy by saying "but Ford trucks are inefficient, there's no imagination there!" and to that I say: more than likely, the same thing applies to the R600.
No. I say people have finally woken up, and are moving away from power-sucking monsters whose performance does not justify the damage to the environment. Look at 4x4; it performs the same and costs the same as a Kentsfield setup if you are building from the ground up. Why hasn't 4x4 sold as well as Kentsfield? Or maybe look at the Pentium 4: it was almost as fast as the A64, and HyperThreading actually made it faster than the A64 in normal everyday use, so why did people move away from it? Because its power draw didn't match its performance.
You say it's an army of fans. I say you are wrong. I say consumers have finally realized that they need to use some common sense.
Funny, rumored specs are exactly what everyone is telling people to wait for the R600 for... they're the best info we have right now. As far as the power consumption goes, it seems reasonable, since this particular info came with OEM pictures.
ATI is not giving us another choice if we want a high-end card but to buy one with high power consumption. If their strategy succeeds, nVidia will follow suit, and that'll be all we even have to choose from. This sort of process should be self-explanatory, but I guess some people don't think beyond the here and now.
Oddly enough, the only reason I have seen people give to wait for the R600 launch is to buy G80s cheaper. I can't recall one person saying "Wait for R600 because it will be faster"; outside the ATI flavor section, all I see is "Wait for R600 because the 8800s will drop in price".
I don't care about either brand, I just thought that was interesting.
In short, they don't actually care that the cards use more power, cause their 8800's use more power than their 7900's. They just care that ATI's cards use more power, that's all.
About always... even just in this post!
Do to, yes, it is. Quite frankly, NO ONE wants to have video cards that take up this much power, but well, here we are and they are forcing a higher power draw standard that no one wants. Would I say it's done personally to anyone, no, but it is something being done to consumers. You can't fault that in any way. People talk about DRM as "doing it to" consumers all the time, how is this in any way different? It's not.
You act like it's just fine without directly saying it, trust me. Patronizing me with comments like this is just plain pathetic, to be quite frank. You ought to give it up, it doesn't lend your opinions any more credibility to anyone, I'm sure.
I think it is sad that all these R600 threads have devolved into stupid Nvidiot vs. ATidiot flame wars. I used to check these threads in the hope that someone had some new rumors or speculation about the upcoming ATI card. Now all there is is whining about how long the card is, even though it's the OEM version and the retail version is (allegedly) shorter than the G80, or about how much power it consumes. People need to get a life and stop worrying over which company is better at any given time. They are both great companies with good products, and competition is good for everyone.
If that chart is right the X2800XL and X2800GT will be going up against the 8800GTS and have roughly the same specs. I wonder if nVidia will be releasing an 8850 or 8900 series...
Nobody that can say really knows just how the R600 will perform.
HA! You can't fool me, that's just a Voodoo 5 6000 with an ATI branded heatsink on it!
I'm not sure why ~30W more than an 8800GTX, with 256MB of additional RAM, is considered that excessive. The max power draw of the connectors is 225W. An 8800GTX is ~170W, so 200W for the R600 seems fairly reasonable considering what you're getting (a rough tally of these numbers is sketched after this post). And going by the 6+8 connector setup that was shown on the cards, I'm guessing power usage is really close to the 225W limit. If you OC the card or up the voltage, power usage can go up really quickly, so keeping the OCing community happy never hurts. Plus, those OEM cards could very well be intended to go into rackmounts and be used as stream processors rather than GPUs, where heat and power usage are of somewhat less concern.
And going off that driver article that was posted, I'd say the drivers should be about where they are now, if not better. Other than the additional datatype formats, I can't think of any DX10 features that couldn't be run on an R580, including GS. Plus the DX10 model seems relatively simplified, so I'd imagine it could be even less work than DX9 drivers.
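For what it's worth, here is a rough back-of-the-envelope tally of the power-budget numbers above. The slot and connector ratings (75W from the x16 slot, 75W from a 6-pin, 150W from an 8-pin) come from the PCI Express specs; the ~170W and ~200W card figures are just the rumored values quoted in this thread, not confirmed specs.

# Back-of-the-envelope GPU power budget.
# Slot/connector ratings are per the PCI Express specs; the card wattages
# below are the rumored figures quoted in this thread, not confirmed specs.

PCIE_SLOT_W  = 75    # power deliverable through the x16 slot itself
SIX_PIN_W    = 75    # 6-pin PCIe auxiliary connector
EIGHT_PIN_W  = 150   # 8-pin PCIe auxiliary connector

GTX_8800_W   = 170   # ~170W for an 8800GTX, as cited above (estimate)
R600_RUMOR_W = 200   # rumored R600 draw

connector_budget = SIX_PIN_W + EIGHT_PIN_W         # 225W, the figure cited above
board_budget     = connector_budget + PCIE_SLOT_W  # 300W if the slot is counted too

print(f"R600 over 8800GTX:            {R600_RUMOR_W - GTX_8800_W} W")   # ~30 W
print(f"Headroom vs. connectors only: {connector_budget - R600_RUMOR_W} W")
print(f"Headroom vs. slot+connectors: {board_budget - R600_RUMOR_W} W")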
About always... even just in this post!
I worded opinions as fact in my last post? Do explain, sir.
Patronizing me with comments like this is just plain pathetic, to be quite frank. You ought to give it up, it doesn't lend your opinions any more credibility to anyone, I'm sure.
I believe otherwise, and I'm fairly confident that many here welcome my "taking you down a peg", but that's a fair opinion to have.
That's_Corporate said: nVidiot is much better than ATidiot.
I don't disagree with this, which is why I prefer the term "fanATic".
InorganicMatter said: No. I say people have finally woken up, and are moving away from power-sucking monsters whose performance does not justify the damage to the environment.
I believe you're making assumptions here. A 40W increase in video card consumption may be offset by turning off a single light bulb, using warm water rather than hot water, upgrading a single kitchen appliance to an Energy Star certified appliance (many of which will save thousands of watt-hours every month), or, quite simply, setting your computer to standby for fifteen to forty-five minutes, depending on the configuration of the machine (rough numbers are sketched after this post).
You're also assuming that all power generated in the United States (or in other countries) results in damage to the environment. Between hydroelectric, nuclear fission, solar and wind-turbine generated power, only the first two present significant dangers to the environment: hydroelectric has a tendency to upset native ecosystems, while nuclear has the unfortunate side effect of producing radioactive waste (which one could cost-effectively shoot into space). Many live in areas that are predominantly fueled by methods of power generation that are not significantly destructive to the environment.
You could say that installing a General Electric turbine is "damaging to the environment", but that's a fairly radical stance.
Although I think we need to be concerned about power consumption from a consumer perspective, I don't think we need to lose our wits over thirty or forty watts.
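To put that 40W offset in concrete terms, here is a purely illustrative sketch. The hours of use, bulb wattage and standby draw below are assumptions picked for the sake of the arithmetic, not figures anyone in this thread has claimed.

# Illustrative watt-hour comparison for a hypothetical 40W increase in GPU draw.
# Every input below is an assumption chosen for the example, not a measured figure.

EXTRA_GPU_W        = 40    # the 40W increase discussed above
HOURS_LOAD_PER_DAY = 3     # assumed time per day the card spends under load
BULB_W             = 60    # one incandescent bulb switched off instead
HOURS_BULB_PER_DAY = 5     # assumed hours that bulb would otherwise burn
SYSTEM_W           = 150   # assumed full-system draw avoided while in standby
STANDBY_HOURS      = 0.5   # 30 minutes of standby per day

gpu_extra  = EXTRA_GPU_W * HOURS_LOAD_PER_DAY * 30   # watt-hours per month
bulb_saved = BULB_W * HOURS_BULB_PER_DAY * 30
standby    = SYSTEM_W * STANDBY_HOURS * 30

print(f"Extra GPU draw:       {gpu_extra / 1000:.1f} kWh/month")   # ~3.6
print(f"Bulb switched off:    {bulb_saved / 1000:.1f} kWh/month")  # ~9.0
print(f"30 min daily standby: {standby / 1000:.2f} kWh/month")     # ~2.25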
I guess it depends, yeah. Some might take offense, others may not.
I've been called a nVidiot, a fanATic, fanb0y, fanbot, nVidia zealot and, my favorite, "paid NVIDIA shill". All are rather entertaining, quite frankly.
So have I over the years, guess what, who the heck cares? If that's all they can come up with to try and debase someone's arguments, they're doing pretty darn well.
Agreed.
Yea, but Wiitard takes the cake.
I'll freely admit to being a "Wiitard"!