ATi R600 vs 8800GTX - New Test & Scores (X2800XTX - X2800XT)

All those air conditioner manufacturers are loving this. If the power consumption for the R600 cards holds true, when a summer blackout occurs, everyone will be like "It's AMD/ATI's fault" =P.
 
Like when, exactly, sir?


A pathetic thing to do to consumers? DO TO?


Nobody has said that, sir.



About always... even just in this post!

"Do to", yes, it is. Quite frankly, NO ONE wants video cards that take up this much power, but here we are, and they are forcing a higher power-draw standard that no one wants. Would I say it's done personally to anyone? No, but it is something being done to consumers; you can't deny that. People talk about DRM as something being "done to" consumers all the time - how is this in any way different? It's not.

You imply it just fine without directly saying it, trust me ;). Patronizing me with comments like this is just plain pathetic, to be quite frank. You ought to give it up, it doesn't lend your opinions any more credibility to anyone, I'm sure.
 

Wait, so what you're saying is that ATi is FORCING people to buy inefficient cards? Right. If people really don't want a card that consumes that much power, they won't buy it, plain and simple.

And you are basing an entire argument on rumored specs. Brilliant!
 

Funny, rumored specs are exactly what everyone is telling people to wait for the R600 for :p ... it's the best info we have right now, though. As far as the power consumption goes, it seems reasonable, since this particular info came with OEM pictures.

ATI is not giving us any choice, if we want a high-end card, but to buy one with high power consumption. If their strategy succeeds, nVidia will follow suit, and that will be all we have to choose from. This sort of process should be self-explanatory, but I guess some people don't think beyond the here and now.
 
So let me get this right, GoldenTiger: if NV releases a power-hungry card, it will be ATI's fault for doing it first? And then there will be no going back?

Man, you should be a lawyer - you have a knack for FUD.
 
Funny, rumored specs are exactly what everyone is telling people to wait for the R600 for :p ... it's the best info we have right now, though. As far as the power consumption goes, it seems reasonable, since this particular info came with OEM pictures.

ATI is not giving us any choice, if we want a high-end card, but to buy one with high power consumption. If their strategy succeeds, nVidia will follow suit, and that will be all we have to choose from. This sort of process should be self-explanatory, but I guess some people don't think beyond the here and now.

nVidia is giving the consumer another choice, though: if the majority of consumers want a high-end card that doesn't suck power, they can go nVidia; if they don't care (like most people that buy high-end equipment), they'll buy ATI (assuming it performs better). And even then, if nVidia follows suit, consumers will have no one to blame but themselves.

This sort of process IS self-explanatory. I don't understand the logic behind faulting ATi for making a power-hungry graphics card; that's the way they make it. It's not like ATi has a graphics card monopoly and is forcing anyone that wants a high-end card to drain power. Get over it, ATi isn't the root of all evil.
 
What I find funny is the rumor mill. Why hasn't ATI released any specs yet? What are they trying to hide? Could the rumored engineering issues be true? I look at it this way: if something walks like a duck, quacks like a duck, and flies like a duck, it must be a duck. The way ATI has this bottled up, and the not-so-glowing rumors we hear going around about the R600 drawing too much power for its performance because ATI had no choice but to simply toss more power at the core due to engineering screw-ups... I start to wonder if it's true. I mean, the card is being delayed until March, when it was supposed to be out already. Why the delay? Currently Nvidia is spanking them in the open market, even with their crappy drivers. If the card is that stellar, and has no issues, what's the holdup? Why the total lack of info from ATI?

Smells fishy to me. Yeah, rumors are rumors, and the benchmarks are BS... but the total lack of info from ATI, and the delays on the card, start to lend weight to the rumors and make you go "Hmmm".

The G80 and the G7x00s didn't have a lot of rumors either.
A lack of rumors doesn't mean anything.
 
I don't understand the logic behind faulting ATi for making a power-hungry graphics card; that's the way they make it. It's not like ATi has a graphics card monopoly and is forcing anyone that wants a high-end card to drain power. Get over it, ATi isn't the root of all evil.


QF(bloodyobvious)T.


edit: And yet, depending on performance and feature set, we don't yet know if it is "inefficient". What I don't understand is why people like me (3D geeks) would want a new card - any new card - to be slower. FFS, it's all about the 3D, people!
 

It's only nVidia people who are complaining.
Don't think it's actually representative of any larger body of people.

If the reverse were true and nVidia cards used more power, then they couldn't care less about power consumption, and I'd put money on that.

In short, they don't actually care that the cards use more power, because their 8800s use more power than their 7900s. They just care that ATI's cards use more power, that's all.
 

Now that's a pretty cynical and conspiracy-theorist view. I say you are wrong. I say people have finally woken up and are saying "NO" to energy-inefficient products. Going by your logic, the inefficiency of Ford's trucks is only an imaginary scenario created by all the Toyota fans who want to make Ford look as bad as possible. Now I know you're going to strike back at my analogy by saying "but Ford trucks are inefficient, there's no imagination there!" and to that I say: more than likely, the same thing applies to R600. ;)

No. I say people have finally woken up, and are moving away from power-sucking monsters whose performance does not justify the damage to the environment. Look at 4x4; it performs the same and costs the same as a Kentsfield setup if you are building from the ground up. Why hasn't 4x4 sold as well as Kentsfield? Or maybe look at the Pentium 4: it was almost as fast as the A64, and Hyper-Threading actually made it faster than the A64 in normal everyday use, so why did people move away from it? Because its power draw didn't match its performance.

You say it's an army of fans. I say you are wrong. I say consumers have finally realized that they need to use some common sense.
 

You're correct; however, $600 video cards are not usually considered normal "consumer" goods. They're enthusiast goods. You think the fuel efficiency of the Bugatti Veyron is really causing a huge consumer uproar? Nope, because it's a small market. Same with high-end 3D cards.
 
Funny, rumored specs are exactly what everyone is telling people to wait for the R600 for :p ... it's the best info we have right now, though. As far as the power consumption goes, it seems reasonable, since this particular info came with OEM pictures.

ATI is not giving us any choice, if we want a high-end card, but to buy one with high power consumption. If their strategy succeeds, nVidia will follow suit, and that will be all we have to choose from. This sort of process should be self-explanatory, but I guess some people don't think beyond the here and now.

Oddly enough, the only reason I have seen people give to wait for the R600 launch is to buy G80s cheaper. I can't recall one person saying "Wait for R600 because it will be faster"; except for the ATI flavor section, all I see is "Wait for R600 because the 8800s will drop in price".

I don't care about either brand; I just thought that was interesting.
 

That's because the majority of people don't buy $600 video cards, so the release of a new product will drop old products' pricing =) I wouldn't mind a $200 1950XTX.
 
In short, they don't actually care that the cards use more power, because their 8800s use more power than their 7900s. They just care that ATI's cards use more power, that's all.

This is so true of true fanatics on either side. Never do they accept that the "enemy's" side is better in any way, and they go on endlessly throwing up arguments. If the other card is faster, well then their card uses less power, is quieter, is smaller in size, and the list goes on and on. You don't have to be a rocket scientist to figure out how those people's logic works.

I have never said this in a forum, but don't feed the troll. By ignoring it, we have the best chance of getting rid of it.
 
About always... even just in this post!

"Do to", yes, it is. Quite frankly, NO ONE wants video cards that take up this much power, but here we are, and they are forcing a higher power-draw standard that no one wants. Would I say it's done personally to anyone? No, but it is something being done to consumers; you can't deny that. People talk about DRM as something being "done to" consumers all the time - how is this in any way different? It's not.

You imply it just fine without directly saying it, trust me ;). Patronizing me with comments like this is just plain pathetic, to be quite frank. You ought to give it up, it doesn't lend your opinions any more credibility to anyone, I'm sure.

I hear there's a rumor that the second power port on the R600 is only for overclocking purposes. Funny how that is a RUMOR too. And if that's the case, it means the power draw is considerably less than what the RUMOR mill has it at. GoldenTiger, UNLESS you have something CONSTRUCTIVE to put forth, GTFO; your constant biased remarks are annoying and quite frankly irritating. Also, by not providing any constructive criticism, you too are bringing less and less credibility to the table than ever before. And one last point: I am waiting for the R600 for several reasons... I want to see what ATI/AMD have in store for increased resolutions... I would like to surround-game with 3 Dell 2405FPWs :)
 
Nobody that can say really knows just how the R600 will perform. Should it be faster than the 8800? Sure, I hope so; it is arriving sometime later. Then nVidia hits their refresh and hops ahead again - maybe.

Better, cheaper, faster. Gotta love it.

I'll admit that I love my current ATI card - granted, it's older, but it's stable and it works.

The next card is a wait-and-see, since it'll be for a new system build.
 
I think it is sad that all these R600 threads have devolved into stupid Nvidiot-vs-ATIdiot flame wars. I used to check these threads in the hope that someone had some new rumors or speculation about the upcoming ATI card. Now all there is are idiots whining about how long the card is (even though that's the OEM version and the retail version is allegedly shorter than the G80) or how much power it consumes. People need to get a life and stop worrying over which company is better at any given time. They are both great companies with good products, and competition is good for everyone.
 

nVidiot is much better than ATidiot.
Sounds more natural.
 
Make note that most of these people posting this "NEW !!!NVIDIA!!! or ATI!!!! BLAH BLAH" information and then adding comments like "X company vs. Y company WIN!! WIN!!" are n00bie posters.

It then generates a long thread and/or discussion about said product.

If that isn't marketing, I don't know what is. I decided to stop posting in the Video Card section here a few days ago, because frankly, the marketers and PR personnel on this forum have gotten out of hand. After looking around though, this isn't limited to here, it's *everywhere* there is a busy forum, so I might as well come back to the place where I get the best news.

If the real posters here and elsewhere would actually stop posting in these garbage threads, the marketers can sit in threads by themselves and try to outsell each other. Hopefully they will eventually go away.

It's gotten to the point where it is sickening, and it isn't just one camp, either. It's BOTH of them, and most of you will see similarities if you actually look at what these characters are posting on BOTH the nVidia and ATI sides.

I love the [H], and this crap just brings it down. The only people that can affect it are the *real* posters that reject this nonsense. I will continue to visit this forum, and post here, but not in these types of blatant PR tactic threads. The OP of this one got banned for idiocy, AS HE WELL SHOULD HAVE, but if you look in this thread there are PR people from both camps here.

For the *real* posters, I'd suggest you ask yourself if you really believe they aren't doing it to get free stuff or remuneration of some sort. Once you have come to your conclusions, decide whether you really want to argue with a company PR spokesperson, and if you think it will be productive.

There isn't any need to name names, or go on a crusade against the individuals doing it - that isn't productive either. I'd hate to see *real* posters get banned over petty, BS marketing flames.

I somewhat cross-posted a reply to a similar thread in the nVidia forum for the sake of impartiality. I can find reasons to hate products from every company on a completely individual basis - they all want our money in the end and will go to any lengths to get it. Remember that when you read these threads and decide to take sides.

My post in the nVidia forum for reference
 
All this fuss over a card that looks about the same as the other brand performance-wise, lol.

I guess we need something to waste our time on.
 
Nobody that can say really knows just how the R600 will perform.

A few people are saying that it will likely be faster based on the few known specs of R600. ATI has typically scaled better than Nvidia when higher levels of AA are used, due to their memory controller. The 512-bit bus has been confirmed, and assuming 1.1GHz memory (not the 1.4GHz stuff that is actually out there), it will have at least ~60% more memory bandwidth than the 8800GTX.

The chips should be roughly the same size, although Nvidia pulled NVIO off the chip and ATI is using an 80nm process instead of 90nm. Unlike Nvidia, ATI had already implemented a bunch of the expensive DX10 features with the R5xx series, so it's not expected that a lot of the transistors are going to be eaten up by going to DX10. So if you take a direct doubling of R580 and then apply the ~25% higher clock speed, R600 should be roughly 2.5x faster than R580+. That doesn't include the boost that will come with going unified either, which would be rather significant in games that were heavily limited by vertex or pixel shaders - and unless CPU limited, that should be just about all games out there to some degree.

Now, going off the benchmarks we have, G80 is, say, 2x the speed of R580+ on average. So that should put R600 about 25% faster than G80, disregarding the move to a unified setup and any other improvements that may have been made.
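For what it's worth, the arithmetic holds up as a back-of-the-envelope estimate. Here's a quick Python sketch of the numbers; the R600 inputs (512-bit bus, 1.1GHz memory, doubled R580 resources at ~25% higher clock) are the rumored values from this thread, not confirmed specs, while the 8800GTX memory spec (384-bit at 900MHz GDDR3) is the published one:

# Back-of-the-envelope check of the estimate above.
def bandwidth_gb_s(bus_bits, mem_clock_mhz, data_rate=2):
    """Peak memory bandwidth in GB/s for DDR-type memory."""
    return bus_bits / 8 * mem_clock_mhz * data_rate / 1000

g80_bw = bandwidth_gb_s(384, 900)     # 8800GTX: 384-bit @ 900MHz GDDR3 -> 86.4 GB/s
r600_bw = bandwidth_gb_s(512, 1100)   # rumored R600: 512-bit @ 1.1GHz -> 140.8 GB/s
print(f"Memory bandwidth advantage: {r600_bw / g80_bw - 1:.0%}")  # ~63%, i.e. "at least ~60%"

r600_vs_r580 = 2.0 * 1.25   # doubled R580 resources x ~25% higher clock = 2.5x
g80_vs_r580 = 2.0           # G80 taken as roughly 2x an R580+ in current benchmarks
print(f"Estimated R600 vs G80: {r600_vs_r580 / g80_vs_r580 - 1:.0%} faster")  # ~25%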
 
If there's any truth to the benches, it makes me wonder if the X2800XTX is going to be another X800XT-PE, overcranked to beat the competition's high end by about 10% for the reviews/benchmarks, but made from cherry-picked cores and only available in tiny quantities. The high power consumption seems to lend support to this theory, since you always have to juice Golden Sample cards beyond reason due to the diminishing returns on the voltage as you push the circuit design too hard.

Then again, if the 8900 cards hit at the same time as these, it might not matter one way or the other. Unless nVidia still doesn't have truly effective Vista drivers and ATi does (DX10, I mean).
 
I'm not sure why ~30W more than an 8800GTX, with 256MB of additional RAM, is considered that excessive. The max power draw of the connectors is 225W. An 8800GTX is ~170W, so 200W for R600 seems fairly reasonable considering what you're getting. And going by the 6+8 connector setup that was shown on the cards, I'm guessing power usage is really close to that 225W limit. If you OC the card or up the voltage, power usage can go up real quickly, so keeping the OCing community happy never hurts. Plus, those OEM cards could very well be intended to go into rackmounts and be used as stream processors rather than GPUs, where heat and power usage are of somewhat less concern.
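For context on those limits, here's a rough sketch: the per-source values of 75W from the slot, 75W per 6-pin and 150W per 8-pin are the standard PCI Express figures, the card wattages are the estimates quoted here, and note the 225W above counts only the two connectors (the slot adds another 75W):

# Power budget each card has to work within. Per-source limits are the
# standard PCI Express values; the ~170W and ~200W draws are the
# estimates quoted in this thread (the R600 figure is still a rumor).
PCIE_SLOT = 75    # W delivered through the motherboard slot
SIX_PIN = 75      # W per 6-pin PCIe connector
EIGHT_PIN = 150   # W per 8-pin PCIe connector

cards = [
    ("8800GTX (2x 6-pin)", [SIX_PIN, SIX_PIN], 170),
    ("R600 rumor (6-pin + 8-pin)", [SIX_PIN, EIGHT_PIN], 200),
]
for name, connectors, draw in cards:
    ceiling = PCIE_SLOT + sum(connectors)
    print(f"{name}: ~{draw}W draw, {ceiling}W ceiling, {ceiling - draw}W headroom")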

And going off that driver article that was posted, I'd say the drivers should be about what they are now, if not better. Other than the additional datatype formats, I can't think of any DX10 features that couldn't be run on an R580, including GS. Plus, the DX10 driver model seems relatively simplified, so I'd imagine it could be even less work than the DX9 drivers.
 

Re: drivers, yeah, that's what I mean - based on the Vista track record so far, it looks like ATi has a good chance to outdo nV in this area, even in DX10.

Re: power, I'm not talking about excessiveness, just about what the power draw might tell us about how close the R600 core is being pushed to the edge for a marketing win. If the raw wattage difference seems relatively small, consider that the R600 core is on an 80nm process while the G80 is on 90nm. The 90nm G80 should be intrinsically less power-efficient per transistor, so the R600's additional power draw despite the smaller process is proportionally even more significant.
 
About always... even just in this post!
I worded opinions as fact in my last post? Do explain, sir.

Patronizing me with comments like this is just plain pathetic, to be quite frank. You ought to give it up, it doesn't lend your opinions any more credibility to anyone, I'm sure.
I believe otherwise, and I'm fairly confident that many here welcome my "taking you down a peg", but that's a fair opinion to have.

That's_Corporate said:
nVidiot is much better than ATidiot.
I don't disagree with this, which is why I prefer the term "fanATic".

InorganicMatter said:
No. I say people have finally woken up, and are moving away from power-sucking monsters whose performance does not justify the damage to the environment.
I believe you're making assumptions here. A 40W increase in video card consumption may be offset by turning off a single light bulb, using warm water rather than hot water, upgrading a single kitchen appliance to an Energy Star certified appliance (of which many will save thousands of watt hours every month), or, quite simply, setting your computer to standby for a period of fifteen to forty-five minutes (depending on the configuration of the machine).

You're also assuming that all power generated in the United States (or in other countries) results in damage to the environment. Between hydroelectric, fission nuclear, solar and wind-turbine generated power, only the first two present significant dangers to the environment. Hydroelectric has a tendency to upset native ecosystems, while nuclear has the unfortunate side-effect of producing radioactive waste (which one could cost-effectively shoot into space). Many live in areas that are predominantly fueled by methods of power generation that are not significantly destructive to the environment.

You could say that installing a General Electric turbine is "damaging to the environment", but that's a fairly radical stance.

Although I think we need to be concerned about power consumption from a consumer perspective, I don't think we need to lose our wits over thirty or forty watts.
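To put rough numbers on the light-bulb comparison above (a sketch only; the four hours per day under load and the $0.10/kWh electricity rate are illustrative assumptions, not figures from this thread):

# Rough scale of a 40W difference in card power draw.
EXTRA_WATTS = 40
HOURS_PER_DAY = 4     # assumed time the card spends under load
RATE_PER_KWH = 0.10   # assumed electricity price in dollars

kwh_per_month = EXTRA_WATTS * HOURS_PER_DAY * 30 / 1000
print(f"~{kwh_per_month:.1f} kWh/month, roughly ${kwh_per_month * RATE_PER_KWH:.2f}")
# ~4.8 kWh/month, about $0.48 -- the same energy as a single 60W bulb
# left on for 80 hours, which is the offset argument made above.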
 
Yea but fanatic isn't an insult.
I mean, I've been called a hockey fanatic before. I never took offence.
Now if they called me a hockey idiot, I might, but that implies something totally different.

nVidiot FTW
It works so well it's almost as though nVidiot was the original name of the company.
 
I guess it depends, yeah. Some might take offense, others may not.

I've been called a nVidiot, a fanATic, fanb0y, fanbot, nVidia zealot and, my favorite, "paid NVIDIA shill". All are rather entertaining, quite frankly.
 
I worded opinions as fact in my last post? Do explain, sir.


I believe otherwise, and I'm fairly confident that many here welcome my "taking you down a peg", but that's a fair opinion to have.


I don't disagree with this, which is why I prefer the term "fanATic".


I believe you're making assumptions here. A 40W increase in video card consumption may be offset by turning off a single light bulb, using warm water rather than hot water, upgrading a single kitchen appliance to an Energy Star certified appliance (of which many will save thousands of watt hours every month), or, quite simply, setting your computer to standby for a period of fifteen to forty-five minutes (depending on the configuration of the machine).

You're also assuming that all power generated in the United States (or in other countries) results in damage to the environment. Between hydroelectric, fission nuclear, solar and wind-turbine generated power, only the first two present significant dangers to the environment. Hydroelectric has a tendency to upset native ecosystems, while nuclear has the unfortunate side-effect of producing radioactive waste (which one could cost-effectively shoot into space). Many live in areas that are predominantly fueled by methods of power generation that are not significantly destructive to the environment.

You could say that installing a General Electric turbine is "damaging to the environment", but that's a fairly radical stance.

Although I think we need to be concerned about power consumption from a consumer perspective, I don't think we need to lose our wits over thirty or forty watts.

Or switching to newer 14-26W fluorescent bulbs :D

http://www.azpartsmaster.com/shopazp/Fluorescent+Lamps.html

Either way, another view: if you don't want to pay the price, you don't get to play the game. If you're so worried, stop trying to convince everyone and go buy a G80, or wait for a G81-based card.
 
I guess it depends, yeah. Some might take offense, others may not.

I've been called a nVidiot, a fanATic, fanb0y, fanbot, nVidia zealot and, my favorite, "paid NVIDIA shill". All are rather entertaining, quite frankly.

Yea but Wiitard takes the cake.
It's basically flawless.
Best video game insult ever.
 
I believe otherwise, and I'm fairly confident that many here welcome my "taking you down a peg", but that's a fair opinion to have.

I think you're assuming too much, and you know what they say about that ;).

I guess it depends, yeah. Some might take offense, others may not.

I've been called a nVidiot, a fanATic, fanb0y, fanbot, nVidia zealot and, my favorite, "paid NVIDIA shill". All are rather entertaining, quite frankly.

So have I over the years. Guess what, who the heck cares? If that's all they can come up with to try to debase someone's arguments, they're doing pretty darn well.
 
Anyone have an idea as to when the NDA is lifted? Perhaps more appropriately: does anyone have any idea when people start getting samples, so that I can pray that someone breaks the NDA?
 
CeBIT is when it's first being shown off, so I'd imagine people should have samples after that date. When you can actually buy them or see the reviews, I'm not sure; anywhere from a week after that up to April 1st would make sense for the actual launch, however.
 
What do you guys think the lowest wattage power supply that will run one of these things is? Would 550 do it?
 