ATI Radeon HD 2900 XT @ [H]

My question for [H] is: why was the EVGA NVIDIA nForce 680i SLI motherboard used when it is clearly designed for SLI?

I understand wanting "equal" ground to test on, but maybe also providing a test on a non-SLI board for comparison would be more beneficial? Just wondering.
 
I'm not sold, however... 3DMark scores just don't matter to me these days.
I find it so hilarious hearing this now. I have always treated "3DMark" as some fancy French word that translated to "skip to the next page". :rolleyes: You mean a significant number of people actually did use that number as a meaningful guide to their purchases? :confused:
 
Save it all until AMD has mature drivers and a DX10 game is out. You people are so gullible, just believe anything Brent and Kyle say, amazing. Why can't anyone else see this is simply a case of immature drivers? (http://www.theinquirer.net/default.aspx?article=39580) And I'm no noob; I've been reading the [H] for a long time, since '98. Brent, Kyle? Or are you still in "favor" *coughbribe* of Nvidia? My advice to all of you... invest in AMD stock, you'll be glad you did. :cool:

You people are really starting to f***** piss me off. AMD threw a fucking party for [H]ard|OCP on Thursday. STFU and face the fact that the 2900, as it stands RIGHT NOW, got beat.

Shenanigans! I call shenanigans.

This review calls out ATI on the price points (you can get an 8800 GTS for less!), but the 8600 reviews that were so glowing were conspicuously silent on the fact that 7950s were both more powerful than the 8600s and cost less. If you're going to dock one video card for being more expensive than another, more powerful card, you should dock ALL video cards for it, rather than singling out ATI's card and failing to mention it for one of nVidia's.

Yes, they called the 8800 Ultra the overpriced trash that it is, but that was so insanely, blatantly obvious as to not even require mention.

I JUST checked the conclusion page for the 8600 GTS reviews; the 7950 does not exist on that page.

You need some sleep, man. Unless you regularly make a first impression as a bit of a jerk. (j/k... sort of)

You speak, Kyle listens, and he changes things if he feels you have made a point worth considering. Look at the evolution of [H]|Consumer. It was for the people COMPLETELY, and the companies involved are the reason it is no longer with us. What more can you want?
 
Save it all until AMD has mature drivers and a DX10 game is out. You people are so gullible, just believe anything Brent and Kyle say, amazing. Why can't anyone else see this is simply a case of immature drivers? And I'm no noob; I've been reading the [H] for a long time, since '98. Brent, Kyle? Or are you still in "favor" *coughbribe* of Nvidia? My advice to all of you... invest in AMD stock, you'll be glad you did. :cool:

Should every reviewer have waited until ATI declared that the drivers didn't suck? How long should we wait for them to get their shit together? 3 or 4 more months? Any bad-mouthing of the 2900 is on ATI for sending cards out knowing the drivers were super weak, with their best driver and hardware effort peaking at only sometimes getting bitchslapped by a cheaper card. That's what they get.
 
My question for [H] is: why was the EVGA NVIDIA nForce 680i SLI motherboard used when it is clearly designed for SLI?

I understand wanting "equal" ground to test on, but maybe also providing a test on a non-SLI board for comparison would be more beneficial? Just wondering.

What's wrong with that? I'm using a single card on my SLI board. It doesn't make any difference; the 2nd slot is disabled if it's not being used...
 
Save it all until AMD has mature drivers and a DX10 game is out. You people are so gullible, just believe anything Brent and Kyle say, amazing. Why can't anyone else see this is simply a case of immature drivers? (http://www.theinquirer.net/default.aspx?article=39580) And I'm no noob; I've been reading the [H] for a long time, since '98. Brent, Kyle? Or are you still in "favor" *coughbribe* of Nvidia? My advice to all of you... invest in AMD stock, you'll be glad you did. :cool:

Ah, if you say you are no noob, I guess we must believe you, because we just believe anything you say, even though your first and only post is a collection of unsubstantiated wishful thinking, insults, and scurrilous accusations of fraud.

Please, go out to the store with the other partisans who refuse to remove their red-tinted glasses and buy ten 2900 XTs today. That way nVidia will continue cutting prices on the truly good cards so that my upgrade package will be more affordable.
 
What's wrong with that? I'm using a single card on my SLI board. It doesn't make any difference; the 2nd slot is disabled if it's not being used...

One of many straws the fanATIcs are clutching at today is the belief that nVidia motherboards somehow sabotage 2900XT performance.
 
Save it all until AMD has mature drivers and a DX10 game is out. You people are so gullible, just believe anything Brent and Kyle say, amazing. Why can't anyone else see this is simply a case of immature drivers? (http://www.theinquirer.net/default.aspx?article=39580) And I'm no noob; I've been reading the [H] for a long time, since '98. Brent, Kyle? Or are you still in "favor" *coughbribe* of Nvidia? My advice to all of you... invest in AMD stock, you'll be glad you did. :cool:
The fact is, the review is on release day. If the drivers are immature, Brent and Kyle will most likely revisit the 2900 XT someday, if the drivers ever make any TANGIBLE difference.

The other problem is that the card has been overhyped, and ATI/AMD people have spun the PR for months about their better drivers. The product was late to market, and ATI does not have the luxury of not performing competitively on launch day with a late product. The market is very fickle about such things.

What I don't understand is that the chip design has been done for a relatively long time, so ATI had a long time to program drivers. The last few re-spins of silicon were most likely to reduce power consumption and heat. So unless there was a fundamental design change recently that would have completely changed how the driver was programmed, ATI really has no excuse.
 
Kyle, what about the HDCP features of the card?

I've only seen Guru3D mention the HDCP stuff; supposedly the HD 2900 XT is also HDCP capable. Have you guys had any chance to test this? (Is there even a way to test this at the moment?)
 
The only good thing I see from this is that Nvidia will probably take its next swing soon by dropping 8800 prices to stonewall the R600 even more, and it won't be long before the 8900 is on its way.

I doubt that...what's the use in dropping prices if your competitor's product isn't a terribly good one?
 
I truly don't see why it is that people cry "bias" or "bribe" when a product doesn't receive a glowing review. Some products are just crap, and that is how it is. The HD 2900 XT isn't exactly crap, but it isn't exactly golden either. HardOCP isn't in the business of calling a lemon an orange. We call it how we see it. It's as simple as that.

Perhaps ATI fan-boys want to be lied to and told that all is rosy and wonderful in the AMD/ATI GPU world. There seem to be plenty of review sites out there that will do that for you. I sincerely hope that the fan-boy is a dying breed, but the rational part of me knows that it is not.

The only "bias" we at HardOCP might have is that we prefer products that are superior to other products. Right now, the 8800 GTS and GTX are better than basically every thing else. Before now, there were ATI cards that were better than NVIDIA counterparts. They are supposed to compete with each other. One card is supposed to "lose" to another card. That is the way competition works. They can't both win.

~~~~~~~~~~~~~~~~~~~~~~~

For me, the power issue clinches it. Even if it matched or surpassed the 8800 GTS step-for-step (in price and performance), the power issue kills the product in my mind. I already spend too much money on electricity, and the Radeon HD 2900 XT does not want to help me with that problem. Add to that problem the lacklustre performance, ridiculous heat production, and the noise, and you have a product that utterly fails to produce any kind of result that satisfies my own wants and needs.
 
You speak, Kyle listens, and he changes things if he feels you have made a point worth considering.
Hey, I am giving him the benefit of the doubt that he listens by making that suggestion. ;) I attached that to only one of his posts, but I wasn't basing that suggestion on a single post...
 
I truly don't see why it is that people cry "bias" or "bribe" when a product doesn't receive a glowing review. Some products are just crap, and that is how it is. The HD 2900 XT isn't exactly crap, but it isn't exactly golden either. HardOCP isn't in the business of calling a lemon an orange. We call it how we see it. It's as simple as that.

If you actually have to point that out, we've got a serious problem here...:rolleyes:
 
One of many straws the fanATIcs are clutching at today is the belief that nVidia motherboards somehow sabotage 2900XT performance.

Ironically, I am no "fanboy"; I was just wondering whether a different board would yield different results, better or worse. When I read reviews, I like to know that everything is done to make the review fair. This review has shown me that the 2900 XT has some life to match the 8800 GTS 640, but it is also a power-hungry little machine, making it not a good buy.
 
Ironically, I am no "fanboy"; I was just wondering whether a different board would yield different results, better or worse. When I read reviews, I like to know that everything is done to make the review fair. This review has shown me that the 2900 XT has some life to match the 8800 GTS 640, but it is also a power-hungry little machine, making it not a good buy.

Motherboards do have an impact, but it is not generally significant. The best thing a motherboard can do for a video card is in the realm of higher overclocking. Brent and I have different motherboards for our testing rigs, and as such, we get slightly different figures when overclocking. Of course, the most important aspect of overclocking a video card is the video card itself. Some cards just won't do it, and some love it.
 
I doubt that...what's the use in dropping prices if your competitor's product isn't a terribly good one?
The 8800 GTS 640 prices have already taken their serious dip earlier last week. Right now the ATI card doesn't reach GTX speeds with any sort of consistency to justify a 1:1 price comparison there. So the GTX and the Ultra are still going to have the high-end monopoly premium. The GTS 320, the 8600, and the 8500 are going to keep their "low-end DX10" monopoly premium, which at the moment is looking absolutely worthless (pending results of that DX10 game demo coming out shortly), until the HD 2600 and HD 2400 models come out (next month?).
 
Motherboards do have an impact, but it is not generally significant. The best thing a motherboard can do for a video card is in the realm of higher overclocking. Brent and I have different motherboards for our testing rigs, and as such, we get slightly different figures when overclocking. Of course, the most important aspect of overclocking a video card is the video card itself. Some cards just won't do it, and some love it.

Thank you for the response. I figured there would not be a huge difference with a non-SLI board; I was just wondering if the option had come up.
 
AMD will bring ATI back to its feet. We cannot blame AMD for ATI's lack of commitment. AMD recently took over, and R600 was long overdue.
 
Kyle, what about the HDCP features of the card?

I've only seen Guru3D mention the HDCP stuff; supposedly the HD 2900 XT is also HDCP capable. Have you guys had any chance to test this? (Is there even a way to test this at the moment?)

For HDCP to be needed, I believe ICT needs to be switched on for the HD DVD / Blu-ray movie you watch, and as far as I'm aware, none of them currently use the ICT. So there isn't really a way to test for it, at least as far as I know.
 
They will release it this fall. The R650 is not delayed and is still on time, according to leaks and sources on the web.

By that time, the 65nm R650, 1 GB of GDDR at 1.4 GHz or higher, and other things will be available to ATI, and NVIDIA will have to answer back.

If the 80nm R600 is hitting 850 MHz on stock cooling with no volt mod, then we should be looking at 850 MHz as the minimum for the R650.

Of course, we don't know what other improvements were supposed to be in the R650 to increase its performance over the R600.

I'm sure by the time the R650 launches, many of these driver issues will be gone and the true performance of the R600 will be seen.

I expect the R650 in August / September.

I'm planning on building my next rig in the fall for the Conan MMORPG.

Dave Orton, is that you?

Oh, by the way, thanks for the laughs I've gotten from your feeble attempts at justifying the piss-poor showing by the HD 2900 XT.
 
Let's keep it to the point, Commander Suzdal, shall we? I added some subjective content; that's my opinion, and everyone's entitled to it, right, Commander? And I am not arguing in favor of how the hard/software looks right now either. The fact is AMD was forced to put this piece of hardware out. Do you know even 1% of what it takes to merge two companies like that? No. It's amazing they have an R600 out so quickly (lol). My gripe is the way the review is handled and the message the reviewers are giving you mindless buyers. The R600 is more advanced than the G80, and it's that simple. The drivers will take advantage of the power soon enough. Oh, and phide, obviously you are a noobster on the topic of stocks! lolZ http://www.theinquirer.net/default.aspx?article=39548
 
There is a DX10 game demo out tomorrow, which is Lost Planet; let's see how it does with that.

I posted a link to another site that has benches with the HD 2900 and the 8800 series in Lost Planet. In short, the 2900 gets its ass handed to it by the GTS, averaging 19 FPS to the GTS's 39 FPS and the GTX's 54 FPS. Some people are saying that this test might have been run with an early engineering sample of the card, or one of the versions still running older drivers with half the ring bus rumoured to be disabled, however.

http://www.pcgameshardware.de/?menu=browser&article_id=601352&image_id=654653
 
There have been a few people pointing out sites that show the 2900XT in a better light, and I have checked a few out, but the one thing I have noticed is that those sites are showing you a maximum FPS. What does it matter if, for one second, any of the cards compared rendered 50 billion frames? What matters is continuous frame rates throughout gameplay, which is what they measure here at the [H]. A timedemo/flyby does not do that. Now, I don't know how long they play the games when they do reviews, but it is nice to see them doing that. Personally, I don't care one way or another about the 2900XT; I bought an 8800GTX on day one and haven't looked back since. But I do hope ATI/AMD does better, because competition is a good thing, and the real winner in the end is the consumer.
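To put some rough numbers on that point, here's a quick sketch (with made-up frame times, not data from any review) of why a peak-FPS figure can flatter a card that actually stutters:

[code]
# Illustrative only: hypothetical frame times (ms) for a short gameplay run.
frame_times_ms = [12, 13, 12, 95, 14, 13, 90, 12, 13, 12]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

peak_fps = max(fps_per_frame)                               # the flattering number
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
worst_fps = min(fps_per_frame)                              # what you actually feel

print(f"peak: {peak_fps:.0f} FPS, avg: {avg_fps:.0f} FPS, worst: {worst_fps:.0f} FPS")
# -> peak: 83 FPS, avg: 35 FPS, worst: 11 FPS
[/code]

The peak says 83 FPS, but the two 90+ ms frames drag the run down to a choppy experience; that is the gap between a flyby's headline number and continuous in-game frame rates.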
 
Incidentally, I don't think these cards are a total flop, for one reason: they match or even outperform the 8800 GTX at ultra-high resolutions or ultra-high AA settings. The verdict across the web seems to be that the HD 2900 XT wins out at 2560x1600, 16X AA, and other extreme settings (probably because of the insane bandwidth of the card), and these types of settings weren't examined by [H]. So for someone who games on a 30" monitor, these cards may prove to be a terrific buy over nVidia's cards considering the price point.
 
Let's keep it to the point, Commander Suzdal, shall we? I added some subjective content; that's my opinion, and everyone's entitled to it, right, Commander? And I am not arguing in favor of how the hard/software looks right now either. The fact is AMD was forced to put this piece of hardware out. Do you know even 1% of what it takes to merge two companies like that? No. It's amazing they have an R600 out so quickly (lol). My gripe is the way the review is handled and the message the reviewers are giving you mindless buyers. The R600 is more advanced than the G80, and it's that simple. The drivers will take advantage of the power soon enough. Oh, and phide, obviously you are a noobster on the topic of stocks! lolZ http://www.theinquirer.net/default.aspx?article=39548

Sorry, do you work for AMD? If not, how do you know what the drivers will do? Are you in the CIA's Remote Viewing program and have seen these miraculous performance increases from within the ATI headquarters?
 
Do you know even 1% of what it takes to merge two companies like that? No. It's amazing they have an R600 out so quickly (lol).
R600 was more or less complete before the merger, and ATi continued operations in Canada as they had previously, despite the merger. In reality, the buyout didn't truly affect R600.

The R600 is more advanced than the G80, and it's that simple.
Prove that.

Oh, and phide, obviously you are a noobster on the topic of stocks!
Market fluctuations due to product announcements don't have any long-term bearing on future stock prices, kiddo. These fluctuations are very typical for days such as today. Continue to watch AMD and NVDA over the next two to three weeks, if you please.
 
Incidentally, I don't think these cards are a total flop, for one reason: they match or even outperform the 8800 GTX at ultra-high resolutions or ultra-high AA settings. The verdict across the web seems to be that the HD 2900 XT wins out at 2560x1600, 16X AA, and other extreme settings (probably because of the insane bandwidth of the card), and these types of settings weren't examined by [H]. So for someone who games on a 30" monitor, these cards may prove to be a terrific buy over nVidia's cards considering the price point.

Link(s) please
 
Let's keep it to the point, Commander Suzdal, shall we? I added some subjective content; that's my opinion, and everyone's entitled to it, right, Commander? And I am not arguing in favor of how the hard/software looks right now either. The fact is AMD was forced to put this piece of hardware out. Do you know even 1% of what it takes to merge two companies like that? No. It's amazing they have an R600 out so quickly (lol). My gripe is the way the review is handled and the message the reviewers are giving you mindless buyers. The R600 is more advanced than the G80, and it's that simple. The drivers will take advantage of the power soon enough. Oh, and phide, obviously you are a noobster on the topic of stocks! lolZ http://www.theinquirer.net/default.aspx?article=39548

This is obviously the first time that a product looked good on paper but didn't perform up to par :eek: :rolleyes: :confused: :p ;) :eek: :cool:

And just take a look at the 3DMark06 scores... which Kyle and Brent showed. It seems to be doing VERY well there, so it might not be "all in the drivers"...

There does seem to be a lot of untapped potential... but I'm not sure I would want that with all the heat and wattage being sucked down by it at its already less-than-ideal performance. I was looking forward to seeing how this card did, and I can see that I'm happy with my 8800 GTS. It's a good thing, too; I thought THIS card sucked back too much power...
 
Be careful with some of the other reviews... I have noticed some using AA/AF levels that don't necessarily match when doing apples-to-apples comparisons.
 
I think the review should have also included a 1950 XTX in there.

Why? This is next-gen up against next-gen.... The 2900XT should be a lot better than the 1950XTX (some of the time).....
 
Incidentally, I don't think these cards are a total flop, for one reason: they match or even outperform the 8800GTX at ultra-high resolution or ultra-high AA settings. The verdict across the web seems to be finding the HD2900XT winning out at 2560x1600, 16xAA, and other extreme settings (probably because of the insane bandwidth of the card), and these types of settings weren't examined by [H]. So for someone who games on a 30" monitor, these cards may prove to be a terrific buy over nVidia's cards considering the price point.


I smell FUD, but would love to see your "proof" and exactly how that data was gathered.
 
Why? This is next-gen up against next-gen.... The 2900XT should be a lot better than the 1950XTX (some of the time).....

Why wouldn't you want to know, just for information purposes, how it performs against their best from the last generation?

From other reviews, it seems that there are one or two cases where it performs around the same level as the 1950. Very strange...
 
I cannot believe that fanboys are already slating people who say that the G80 is better than the HD 2900; they are desperately clinging to the hope that it will improve.

We all know what will happen: there will be an R650/R700 before Christmas, pissing off lots of people who stumped up cash for the R600. There will then be a cut-price R600 which will actually perform staggeringly well for its price point.

Honestly, though, ATI has "lost" the high-end round this time... Done.

Conversely, we have to see what performs best in the mid-range to truly see who will make the big bucks here.

 
What about 1600x1200 resolution? You have 19in LCD & 24in LCD users, but you completely missed the 20.1 & 21.3in LCD users with 1600x1200 resolution benchmarks!

Please revise!

Read the eval, we tested at 1600x1200.

OK, I am no GPU expert... but on paper this thing looks awesome (spec-wise)... even compared to the GTX part. Was it too early? Is it the drivers, bad implementation... what?

The ASIC basically just sucks at 80nm. It was intended for much higher frequencies, but those cannot be realized yet.

I just want to know: is this card destined to be a flop like Windows ME, or is there still a light at the end of the tunnel and hope for a resurrection?

With a die shrink and driver updates, it could have a bright future. Like we said, the architecture is fine, really; it can scale up quite high. The potential is there, but it is not being realized.

I think some of you guys look too damn close at the comparison shots. I mean, they're still images of a moving scene. What counts when it comes to AA and AF is how it looks in a living, moving scene. I have seen cards that have a lot of shimmering because of poor AF or AA, but the screenshots look awesome. The problem with this is that it comes down to a subjective opinion; the reviewer must give us his opinion on how the image quality is in a real in-game scene. You can sit and stare at a screenshot for hours, but when you're in a game you will have about 1/30th of a second or less to take in that picture before it is replaced with the next. The important part is how those two pictures work together!

Also, low levels of AF can be seen better in a moving scene. When you're walking down the road in World of Warcraft, those AF bands really stand out as they move with you; that is something no screenshot can show.

Of course, our IQ analysis comes from seeing the game in live motion. It is impossible to show this to everyone, though; the best we can do right now is screenshots. I wish everyone could see the games with the cards in person.

If nothing else, I suppose we can thank ATI for the humor of introducing an acronym like WTF AA to the lexicon.

Isn't that just hilarious? I was thinking, when I first saw the names, what would we abbreviate them as in our reviews? Well, Narrow Tent Filtering would be NTF, and Wide Tent Filtering had to be WTF, lol.

What kind of hit did the NTF have?

It is hard to picture motion from stills, obviously, and IQ can be a very subjective thing at the best of times, but stills with text are pretty easy to judge, and I can see where the use of WTF would be relatively limited in games and definitely not for everyone's tastes even at that.

Box Filter = 99 FPS, NTF = 41 FPS, WTF = 38 FPS. This is at 1024x768 in HL2 using 8X AA.
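In relative terms (just arithmetic on the figures above), both tent filters give up well over half the frame rate:

[code]
# Relative cost of the tent filters vs. the box filter,
# using the 1024x768 / 8X AA HL2 figures quoted above.
box, ntf, wtf = 99.0, 41.0, 38.0

ntf_hit = 1 - ntf / box   # ~0.59 -> NTF costs roughly 59% of the frame rate
wtf_hit = 1 - wtf / box   # ~0.62 -> WTF costs roughly 62%

print(f"NTF hit: {ntf_hit:.0%}, WTF hit: {wtf_hit:.0%}")
[/code]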

They will release it this fall. The R650 is not delayed and is still on time, according to leaks and sources on the web.

By that time, the 65nm R650, 1 GB of GDDR at 1.4 GHz or higher, and other things will be available to ATI, and NVIDIA will have to answer back.

If the 80nm R600 is hitting 850 MHz on stock cooling with no volt mod, then we should be looking at 850 MHz as the minimum for the R650.

Of course, we don't know what other improvements were supposed to be in the R650 to increase its performance over the R600.

I'm sure by the time the R650 launches, many of these driver issues will be gone and the true performance of the R600 will be seen.

I expect the R650 in August / September.

I'm planning on building my next rig in the fall for the Conan MMORPG.

It is interesting that you claim to know what GPUs will be released, and when, when I don't even know.

Some reviewers like the new FSAA modes. One even states it's better than NVIDIA's.

The new FSAA modes may help on edges at high AA settings (which we noted), but they do blur everything else, so you have to decide if this is a tradeoff you are willing to make. I surely am not.
 
There does seem to be a lot of untapped potential... but I'm not sure I would want that with all the heat and wattage being sucked down by it at its already less-than-ideal performance.
Agreed. Driver updates will probably help some, but not enough to justify buying it unless it drops BELOW the cost of an 8800 GTS. And dealing with the heat and noise while gaming is another issue that MANY people just won't put up with. I'd rather have a slightly SLOWER card that is quieter than one that may perform slightly better but is distracting due to fan noise.
 
Why? This is next-gen up against next-gen.... The 2900XT should be a lot better than the 1950XTX (some of the time).....

I think that there is some merit to including older cards in these evaluations, but there is a big problem too.

I think that a lot of people are looking at upgrading from older video cards such as the ATI X1K series or NVIDIA 7 series, and including numbers from that generation can make it more easily apparent where the strengths and advantages are in the new generation.

On the other hand, we have done evaluations comparing previous generation technology to the GeForce 8800 series, so the data is there, if you do some cross-referencing.

On the down side, doing these evaluations takes an incredible amount of time and work. We simply do not have the time that we would need to include all of the hardware I would like in these articles. This is especially true on launch-day products. We get this hardware for a short amount of time before launch date, and as such, we don't have enough time to test half a dozen video cards before launch. I'm surprised Brent got these 3 done in time.

The sites that have 5+ video cards in their benchmarks are not doing you any favors. They are running benchmarks, timedemos, and very short in-game runthroughs for their testing. Do not be fooled; they are not thorough.
 