ATI Radeon HD 2900 XT @ [H]

Honestly, if they had said the 2900 XT is underperforming currently, or that as of right now the 2900 XT is showing disappointing performance, or that it is far from what we expected, I would have agreed with the statement. But saying the product is a "flop" is like saying it has already failed, that there is no hope, and that you shouldn't even consider buying it now or down the road. I just think it gives readers the wrong impression.

That's all they can say, though - the card was significantly delayed and launched a long time after the nVidia offering, yet it still fails to match up to it. It's reasonable to say that yes, it could see a large increase in performance eventually, but is that worth stating?

Down the line, if there is a SIGNIFICANT change, then I'm sure [H] will have an article about it - even if they didn't, the information would be out there. As it is now, the product is a flop because it fails to make any impact against the rival offering, and there is no reason, in terms of game performance, for the card to be chosen. Coming out of the gate strong, especially when you're up against the biggest and baddest out there, is important.

The product could pick itself up, but at this point in time there is no reason for it to be considered if you're looking for a better gaming experience - therefore, in its primary goal, it is a flop.
 
The price will fall down to its place in the performance bracket. It won't sell as much but you can't call it a flop yet.
 
Very nice review.

Somewhat disappointed that the 2900 XT doesn't perform better in DX9 games, but all in all I am still buying at least one. What I find funny is how people post about the 2900's and 8800's performance like they had a direct hand in it. People who score the touchdown get to do the end zone dance; all we do is buy a ticket and go to the game.

The FX series didn't kill Nvidia (5500, 5700, 5900 and finally 5950), so what makes people think that releasing a card like the 2900 XT will kill AMD/ATI?
 
The price will fall down to its place in the performance bracket. It won't sell as much but you can't call it a flop yet.

I'm pretty sure they meant that it's a flop performance-wise.

Very nice review.

Somewhat disappointed that the 2900 XT doesn't perform better in DX9 games, but all in all I am still buying at least one. What I find funny is how people post about the 2900's and 8800's performance like they had a direct hand in it. People who score the touchdown get to do the end zone dance; all we do is buy a ticket and go to the game.

The FX series didn't kill Nvidia (5500, 5700, 5900 and finally 5950), so what makes people think that releasing a card like the 2900 XT will kill AMD/ATI?

Actually, the 5700/5900/5950 were decent cards; it's the 5600/5800 that flopped (I probably shouldn't have used that word, rofl). Also, I don't think this will kill AMD/ATI, but in its current financial state, it's the last thing they need.
 
Oh, and comparing the architectures for DX10 gaming, it looks like the best AMD could hope for is just under the 8800 GTX. I would be VERY surprised if it were even slightly faster than the GTS/GTX overall in DX10. Drivers may help a bit, but I would also be very surprised to see more than a 5%-10% increase, which would still not be enough to justify one.

Plus, it is all speculation, since we aren't engineers at AMD and they haven't made any promises to this extent. You saw how well speculation worked for those fanbois who delayed GPU purchases for months.

This needs to drop in price to below that of an 8800 GTS to even be considered.
 
ROFL, once you actually have data in which the R600 outperforms the G80, then (and only then) should you carry your motion forward. Was [H] supposed to search for a game in which the R600 outperforms the G80 (if there is one)? They used their usual suite of games, the SAME one they've used for the last half year or so.

you should probably stop kissing ass, i'm sure Brent doesn't need you to defend him, i'll go find the data, and ill stick it in your noob face so u can stfu.
 
:(

wow ....just wow ...

I was considering getting the HD2900xt ..but not anymore


[F]old|[H]ard
 
The price will fall down to its place in the performance bracket. It won't sell as much but you can't call it a flop yet.

When it is $XXX, maybe it will be a great buy. We will surely see. We will give it a shining review when it reaches that magical point at which it becomes a great value. Today @ $399, it is a flop.
 
you should probably stop kissing ass, i'm sure Brent doesn't need you to defend him, i'll go find the data, and ill stick it in your noob face so u can stfu.


You were warned not to do this. 1 month ban. Come back once the next Catalyst drivers are out and we will talk about R600 again.
 
So tell me, would you purchase this video card right now over an 8800 GTS or GTX?

QFT. I'd like to know why anyone would purchase this card at MSRP over a GTS/GTX.

The only thing I can think of is wanting to run dual GPUs on an Intel mobo. Other than that, I'm at a loss.
 
Excellent review!
ATI/AMD's effort is totally disappointing. Lower image quality, greater current draw, higher price tag AND 6 months late! Makes me wonder if AMD buying ATI caused ATI problems. This review just fortifies my decision to buy an 8800 GTX. The sad part is that I wanted the HD 2900 XT to turn some heads and drive nVidia prices down. The article seemed a bit long, but I suppose that was required to satisfy the largest possible audience. Would have been nice to have seen some video playback comparisons.
 
This may be a bit of a crappy place to post this, but I'd like to give a small note of thanks to [H] for how they do their reviews. These aren't just comparisons of IQ or sheer frame rate numbers - there is a wealth of sites out there that provide that information, and frankly, for a much larger library of games.

That being said, I do love how the cards are not judged at identical settings for the bulk of the testing, and that is what makes these tests much more valuable to me. Changing resolution, AA, and in-game settings on a per-card basis to attain the best playing experience is, I'm sure, a very time-consuming process, yet the benefit is that it gives a much easier way to gauge the power of the cards, and a very well-done overview of them, rather than simply crunching numbers.

So yeah, thanks guys.
 
This card is hugely disappointing. I'm quite happy to "tough it out" with the X1950XTX until AMD/ATI can produce a proper card, one that actually performs better than its nVidia competition. :p
 
Although I wasn't too happy about some other reviews you guys wrote, you got this one right. The card is late, hot, power hungry, and a bit lacking in the performance department.
I hoped it would run well, since I'm stuck with the 8800 GTX and wanted a bit more power for my 30" LCD, and I can't go SLI on my 975X.

Oh well....maybe next time AMD.
 
Nice review... after reading it, I started to think.

This card doesn't really outperform the high-end 1950 cards, etc., by much, and it made me wonder.

How could a company with a huge technology budget design a new high-end offering that doesn't really whoop up on their old high-end offering? It's not like they didn't test the card themselves against other offerings at various price points.

It is after this that I come to several possible conclusions:

A. The card performs extremely well in DX10 compared to DX9, where it only performs moderately. Then, in a few months, it will be considered a bargain by your average shopper. And it may or may not perform well in the game it is being released with.

B. The card has less memory, and is therefore struggling to keep up with the higher cards, and the higher-memory version will be more on par.

C. The card was released with buggy drivers for things like grass, etc., and they will be updated to take advantage of such details in the various games that offer them.

D. The card is being released with performance highly degraded due to the fear of excessive power consumption causing problems with the average person's system. However, the card will be scaled up and released in several versions from companies like XFX and Sapphire, in which it will be run 100MHz-200MHz faster and be on par with its price point vs. the GTS/GTX.

E. The card will overclock to much faster than an overclocked GTS/GTX, and will therefore be used in the best of the best systems, where the owner does not mind overclocking and/or running 700-800 watt power supplies (I run a 700 myself).

F. AMD/ATI smokes crack and the card is junk, and it will be highly marked down in price to sell it, and the company will lose a lot of money.

I tend to think that the card is not yet a flop in my eyes, simply because I am not in the market for such a card just yet. I will be buying a new card around the release of Crysis for DX10; until then, I love my 7900GS, which has no problem running well over 600MHz on stock voltage and will likely see a voltage bump next week if I get around to ordering a VF900 :)

Competition is what the economy is about... if AMD/ATI's card performs up to snuff with, or better than, the GTX in DX10 applications, then there will be some price drops, in which case I'll buy the cheapest card that'll get me DX10 in Crysis at good detail at 1680x1050 :) Even if that means needing to overclock to get there. :)

Thanks for the review once again, and I really love this site!

Josh
 
Very nice review.

Somewhat disappointed that the 2900 XT doesn't perform better in DX9 games, but all in all I am still buying at least one. What I find funny is how people post about the 2900's and 8800's performance like they had a direct hand in it. People who score the touchdown get to do the end zone dance; all we do is buy a ticket and go to the game.

The FX series didn't kill Nvidia (5500, 5700, 5900 and finally 5950), so what makes people think that releasing a card like the 2900 XT will kill AMD/ATI?


I owned 9700s and 9800s when the FX cards were out, since the FX cards were not good value for the money, nor were they desirable to a hardcore gamer.

I've had 7800s/7900s when they were the better cards, and 8800s after that. Always going after the best performance and just buying the best at the time. Like the X1950 Pro AGP in this old HTPC system I'm on now. It's the best of what's out there (for AGP).

People like us don't make up the majority of the real market, but some of the power / heat issues may cause AMD trouble getting these cards into, say, a Dell XPS, a Best Buy computer, or the average retail customer's PC.

It's not going to kill them, but they did not grab the performance crown this time, and they failed in the market in ways the FX series didn't. Clueless consumers still bought the FX in droves, and the OEMs scored deals on them and put them everywhere. This will happen with the likely good lower-end AMD cards, but the higher end will likely be skipped by a lot of OEMs and consumers, which will hurt AMD more than the FX hurt nVidia.

Hopefully AMD will come back with an outstanding product next time, or the consumer loses.
 
This may be a bit of a crappy place to post this, but I'd like to give a small note of thanks to [H] for how they do their reviews. These aren't just comparisons of IQ or sheer frame rate numbers - there is a wealth of sites out there that provide that information, and frankly, for a much larger library of games.

That being said, I do love how the cards are not judged at identical settings for the bulk of the testing, and that is what makes these tests much more valuable to me. Changing resolution, AA, and in-game settings on a per-card basis to attain the best playing experience is, I'm sure, a very time-consuming process, yet the benefit is that it gives a much easier way to gauge the power of the cards, and a very well-done overview of them, rather than simply crunching numbers.

So yeah, thanks guys.

You are quite welcome :)

It is hard work, but we enjoy doing it and feel it is the best way to relate to our readers the gaming experience provided by each video card.
 
Drivers drivers drivers drivers drivers drivers. And, of course, it's not in its native API.

Gawd, I hope this card isn't as big a flop as it looks like it's gonna be.

I'm half expecting you guys at HardOCP to get an e-mail from AMD[ATI] saying:
"hahaHA! we tricked them dorks at NV harder than i banged.... *ahem* the card we sent you is an XL! here's the REAL XT, and here's the REAL XTX!"
 
I tend to think that the card is not yet a flop in my eyes, simply because I am not in the market for such a card just yet. I will be buying a new card around the release of Crysis for DX10; until then, I love my 7900GS, which has no problem running well over 600MHz on stock voltage and will likely see a voltage bump next week if I get around to ordering a VF900 :)

Competition is what the economy is about... if AMD/ATI's card performs up to snuff with, or better than, the GTX in DX10 applications, then there will be some price drops, in which case I'll buy the cheapest card that'll get me DX10 in Crysis at good detail at 1680x1050 :) Even if that means needing to overclock to get there. :)

I see your point, but some people want (for example) to run STALKER/Oblivion with all the settings cranked up to max at very high resolutions today. I don't see any other cards that can do that besides the GF8 series. What ATI needs to capitalize on now is nVidia's lack of success in the mainstream. ATI has the best-performing card under $200 (X1950XT) right now, and they need to keep it that way.
 
You can throw in all the misdirected emotion you want......

The card and / or driver set AS IT IS PRESENTED is a "flop". That's it. Bottom line.

It has been explained to you on several occasions that future driver releases will undoubtedly increase the throughput and ability of this card, but that's NOT HERE NOW.

Similar complaints were made about nVidia at the release of the 88xx series, and now, 6 months later, I can honestly say I have seen a significant improvement; hopefully ATI will be able to do the same.

Now, with that being said, the card and driver support are still 6 MONTHS LATE!... If you want, I'll repeat it again, but that fact is grossly obvious. Fanboi all you want, the fact remains: undisputed cold fact. They knew this, yet still told individuals (myself included) to wait it out, that there would be a competitive product released. At this moment, that is totally wrong. I'm not saying that it couldn't change, BUT I feel there is a large number of people (myself included) who chose to wait it out.

We were wrong and were misled....

Now, let's add further pain / anger to this.... the XTX variant will be "released" in about another 2 months..... Why? This is piss-poor product planning. A lot of individuals will have transitioned to the 8800 GTX, or a lesser variant with a proven track record, by then. Unless there is a significant increase in ability (doubtful at this time, but who knows...), those individuals will have changed over, and that equals further lost respect / money / whatever for the ATI/AMD gaggle.

This also has the potential to carry over to the Barcelona release (off topic, but those 2 companies have merged, and this is a hell of an initial impression of product ability / quality). A company is only as good as its product and reputation. This has created a large deficit in both.

I, like others, chose to wait for no good reason but faith. To say that it has been tarnished minimizes this release to a large degree. I'd rather shit in my hand and have something more worthwhile to show off (and cooler in temperature, too) than the 2900 XT in its current evolution.

Please don't get me started on the whole power consumption issue... :(

You say this review gives a "wrong impression to readers"? Fine, go photoshop some results and see if that changes anything. I'm not saying that the future may not, MAY not, brighten this product line, but first impressions go a long way, and this product release stinks of manure. Maybe they can grow a rose with it, but I'm probably not going to wait and see, and nor are others who have waited this long.
 
FYI, Thanks Kyle for telling it like it is. The HD 2900 XT is a flop.

Maybe drivers will make the card faster. But it's still an expensive 200W piece of garbage.

Let's say that drivers can make the HD 2900 XT 20% faster (that's a VERY optimistic estimate). It's now 10% faster than the 8800 GTS.

10% isn't much to give up for superior image quality, 100W less power usage (and less heat/noise), and $70 in your pocket. I'd do it, and I'm sure that most [H] readers would too.
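For anyone checking the math, here's a rough sketch of what that scenario implies - the 0.92 baseline below is my own assumption for roughly where the 2900 XT sits relative to the GTS today, not a benchmark figure:

[code]
# Illustrative back-of-envelope math only; not benchmark data.
gts = 1.00                    # 8800 GTS taken as the baseline
xt_today = 0.92               # assumed HD 2900 XT performance relative to the GTS today
xt_after = xt_today * 1.20    # the "VERY optimistic" 20% driver improvement

print(f"XT vs GTS today:      {xt_today / gts - 1:+.0%}")   # roughly -8%
print(f"XT vs GTS after +20%: {xt_after / gts - 1:+.0%}")   # roughly +10%
[/code]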

Not to mention that you could have had that performance 6 months ago.

It doesn't matter if AMD can make the HD 2900 XT a little faster than the 8800 GTS with driver updates. Considering that it came out 6 months later, uses 100W more power, and is more expensive, it needs to be SIGNIFICANTLY faster than the 8800 GTS.

This thing is a turkey. If you're buying a $400 200W GPU (and the similarly expensive 650W+ PSU you're going to need to run just one of these cards), you clearly value performance over heat/power/noise. That's a fine place to be, but if you're there, why would you want anything less than top-of-the-line performance and image quality? You don't buy an expensive PSU and deal with noisy cooling to get upper mid-range performance. You should demand the best.

AMD's new card has all of the costs of buying the best GPU (high card price, high PSU requirements, large physical size, noisy cooling), but it doesn't have the performance to back it up.

This is the FX5800 all over again. It's late, power-hungry, hot-running, has poorer image quality, and is - at best - as fast as the 6-month-old competition.

AMD, we're ready for your 6800GT. It's time to get out of the fog and release something that's actually better than the 6-month-old NVIDIA product.
 
This may be a bit of a crappy place to post this, but I'd like to give a small note of thanks to [H] for how they do their reviews. These aren't just comparisons of IQ or sheer frame rate numbers - there is a wealth of sites out there that provide that information, and frankly, for a much larger library of games.

That being said, I do love how the cards are not judged at identical settings for the bulk of the testing, and that is what makes these tests much more valuable to me. Changing resolution, AA, and in-game settings on a per-card basis to attain the best playing experience is, I'm sure, a very time-consuming process, yet the benefit is that it gives a much easier way to gauge the power of the cards, and a very well-done overview of them, rather than simply crunching numbers.

So yeah, thanks guys.

lol
ahahaha, man you had me goin there at first. gotta hand it to you, you sure know how to throw dirt right back in their faces. very cheeky
 
How could a company with a huge technology budget design a new high-end offering that barely outperforms their old high-end offering? It's not like they didn't test the card themselves against other offerings at various price points.

The ASIC design is shit. This card is supposed to be running at a 1GHz clock, if not higher, and due to terrible leakage issues they simply cannot get enough power into it to scale to the frequency they need.

Make no mistake about it: this card has the technology it needs to compete against NVIDIA; it's just that this ASIC sucks. Maybe they will have better luck at 65nm if they try to respin it, but I am not sure that is in the cards....
 
Oh, and even if it's the cat's meow at DX10 and the 8800 is an FX doing DX9, there will be a better AMD card AND a better nVidia card out by the time DX10 gaming performance is really an issue for buyers.

Crysis may or may not be the "halo" of DX10 gaming... If it's not, we will really need to see a nice library of DX10 games before it's going to matter to anyone.
 
So tell me, would you purchase this video card right now over an 8800 GTS or GTX?

As of right now, no.

I'm most interested in the 8800 GTS 320MB out of all the DX10 cards right now. But then again, everyone has different needs and preferences. And I won't be buying a DX10 video card until DX10 games are out; my 7900GT provides me enough power for all my DX9 titles.

I think I know where you are going with this, and you are right when you say that the 8800 GTS and the 8800 GTX are better choices "as of now" over the 2900 XT. But does that necessarily mean the 2900 XT is a flop? In my book, that would mean it's a complete failure and that it shouldn't be bought now or ever. But even now, for certain people who want the HDMI with audio and the UVD, it might still be a buy, so it's not a flop in every way. I really hope I didn't make you guys mad with my arguing, but IMO it just seemed too harsh and left no room for hope. I just wanna give my opinion, and I know you guys listen to the community and change and improve things constantly with praise and even with criticism - that's why you have one of the most successful and most valued sites around.

So thanks for actually respecting my opinion. ;)

Anyway I'm done now.
 
Drivers drivers drivers drivers drivers drivers. And, of course, it's not in its native API.
So why is it that people will defend AMD for shipping the card without completing the drivers for said card? If the drivers suck, what does that say about their ability to control every aspect of the release process?

And yes, I know that nVidia's G80 drivers supposedly sucked to begin with as well. I say supposedly because I've never had any issues with my drivers, but others have. I haven't tried Vista with the card myself, either.

My point is simply that if they want to ship with less-than-optimal drivers, they can't bitch about less-than-optimal testing results.
 
Hey Brent/Kyle, are you guys into the culinary arts much? You reported an exhaust heat of 181F out the back of your computer while you were running two 2900 XTs in CrossFire mode. Can you do me a favor? At 181F, I think that would be a perfect temperature to slow-cook chicken. Try it! Then post your results. If you can cook a delicious piece o' chicken, then maybe the 2900 XT isn't a flop after all. You can say it may not be the best-performing GPU on the market, but at least it'll cook you dinner while you're waiting for the next frame to render!
 
mmm... chicken :p

I hope AMD releases a successful match to the G80 soon so nVidia will roll out the 8900's...
 
Hey Brent/Kyle, are you guys into the culinary arts much? You reported an exhaust heat of 181F out the back of your computer while you were running two 2900 XTs in CrossFire mode. Can you do me a favor? At 181F, I think that would be a perfect temperature to slow-cook chicken. Try it! Then post your results. If you can cook a delicious piece o' chicken, then maybe the 2900 XT isn't a flop after all. You can say it may not be the best-performing GPU on the market, but at least it'll cook you dinner while you're waiting for the next frame to render!


The exhaust was so hot you could not leave your hand in the stream. Hehe. Again, this was under stress testing. I could get the 8800 GTX up to 170F.
 
Put a box in the back with a door, funnel the exhaust into the box, and put a cake in there, or the chicken that was mentioned. You now have yourself an Easy-Bake Oven - [H] style.
 
I just want to sum up my thoughts. I hope AMD/ATI learns from this mistake and comes back with an awesome R700 GPU. Nvidia did this with the 6 series, and after meeting some of the folks behind ATI at a recent party, I don't see a reason why ATI shouldn't.

Signing off,
Fanboy of Competition.
 
I just want to sum up my thoughts. I hope AMD/ATI learns from this mistake and comes back with an awesome R700 GPU. Nvidia did this with the 6 series, and after meeting some of the folks behind ATI at a recent party, I don't see a reason why ATI shouldn't.

Signing off,
Fanboy of Competition.


I have to agree with you on this. I think the R700 is likely poised for greatness.
 
This card barely outperforms the high-end 1950 cards, etc., and it made me wonder.
...
It is after this that I come to several possible conclusions:
I'd add:
G. It matches up like crap in all the particular games [H]ard is using.

That isn't meant to knock Brent or his particular testing procedure; it is what it is. If those are the games you use and you intend to play them the way [H]ard tests them ([H]ard's going to catch a fairly wide swath, and it is good that they provide another choice in what is covered), then it's pretty obvious you shouldn't be looking at a 2900 XT. But I've been looking around, and right now it is looking really weird. Stuff just isn't making sense. Sometimes the XT bricks when you dial up the IQ settings, sometimes it holds an even keel, and occasionally it comes out relatively better than the GTS/GTX (and yes, I'm talking in that nicely playable 30-50ish FPS range).

So I think with this card, more than usual, you really need to try to find a review that matches what you intend to do. EDIT: And wait for some more data points, and make sure you know WTF the reviewer really was testing.

P.S. Speaking of which, I haven't noticed a single Vista-based review yet. Given the prevailing mood right now, that could be a while coming. But I wish it weren't. One game I am interested in playing is Shadowrun, and it is Vista-only (but not DX10).
 
Why were the Beta 8.37 drivers used instead of the release 8.36 drivers?
Beta drivers are just that: BETA... They don't always improve performance, and are sometimes more buggy. I have seen this in many previous [H] reviews, and it still perplexes me. Many times you do list the reason for the beta driver in the review.

I made a previous comment about the performance of this card, but I now see you used beta drivers. Was there a reason you specifically used the beta drivers? Did you test with the 8.36 release drivers first and determine that the 8.37 betas had more performance? I'm confused that you didn't mention this in the article...
There's buzz going around that the new beta driver isn't stable. I hope it isn't the same beta driver used in the review.

I also find it odd there are a few reviews going around that show the 2900 XT performing exactly where many believed it would a few weeks ago... ahead of or equal to the 8800 GTS. Can anyone confirm whether these sites are using timedemos and not actual gameplay?
http://www.techpowerup.com/reviews/ATI/HD_2900_XT/1
http://www.tweaktown.com/articles/1100/1/page_1_introduction/index.html
http://www.tbreak.com/reviews/article.php?id=511
 
The 8.37 drivers are the ones AMD has on its site for download right now, so what exactly is the problem?
 
Why were the Beta 8.37 drivers used instead of the release 8.36 drivers?
Beta drivers are just that: BETA... They don't always improve performance, and are sometimes more buggy. I have seen this in many previous [H] reviews, and it still perplexes me. Many times you do list the reason for the beta driver in the review.

I made a previous comment about the performance of this card, but I now see you used beta drivers. Was there a reason you specifically used the beta drivers? Did you test with the 8.36 release drivers first and determine that the 8.37 betas had more performance? I'm confused that you didn't mention this in the article...
There's buzz going around that the new beta driver isn't stable. I hope it isn't the same beta driver used in the review.

I also find it odd there are a few reviews going around that show the 2900 XT performing exactly where many believed it would a few weeks ago... ahead of or equal to the 8800 GTS. Can anyone confirm whether these sites are using timedemos and not actual gameplay?
http://www.techpowerup.com/reviews/ATI/HD_2900_XT/1
http://www.tweaktown.com/articles/1100/1/page_1_introduction/index.html
http://www.tbreak.com/reviews/article.php?id=511

So are you going to ask TechPowerUp why they were using 91.47s for the 8800 when 158.22s are available?
 