Blackstone
2[H]4U
- Joined
- Mar 8, 2007
- Messages
- 3,583
Update/Disclaimer: I'm just a normal end user, and these are just some subjective observations about the 2900 XT after playing with it. The purpose of this thread is to collect data and opinions on the card and to speculate about how it will perform in future games.
This morning I installed what I believe is Catalyst driver version 8.38 RC7 and took a walk through the woods in Oblivion at 1680 x 1050 with 4xAA and 16xAF forced at the driver level, and HDR enabled in the game. I have been testing informally and subjectively from the same save point in the game with different drivers over the past few days.
Is there an improvement from one driver to the next? The answer is no. This card simply chokes on scenes with a lot of trees at the above settings, and the frame rate drops into "this is laggy" territory pretty consistently. This is the third driver I have tried, and there is no real subjective improvement. If there is an improvement, I don't see it, or at least it isn't meaningful. The issue is clearly the trees, specifically the tops of the trees with all the leaves. That is without a doubt what the card is struggling with at 4xAA. It is very disappointing.
Note that the card performs better in the Shivering Isles expansion pack because oversized mushrooms replace a lot of the trees, at least in the areas I have played through. Mushrooms obviously do not have lots of leaves to render, so that makes sense.
Also note that in Battlefield 2, you would expect the card to handle such an old game with ease, but 8x antialiasing is also laggy when there are lots of trees with lots of leaves. On most maps, 8xAA is not really an option. As that is the only in-game option higher than 4x available to me (and forcing AA at the driver level doesn't work for me), this card yields no gameplay advantage over my old X1800 XT except better framerates. There is no image quality improvement.
In both games, if you pan down and just look at the grass, the frame rate jumps up; but when you pan back up so that lots of trees are in frame, the frame rate slows to a crawl. Even if you turn the grass distance all the way down in Oblivion, you still get laggy frame rates at 4xAA because of those trees.
At first I couldn't figure out why some reviews of this card were so positive while [H]'s review was so harsh. I think I get it now. [H]'s review, by stressing maximum playable settings as opposed to frame rates and resolutions, highlights what this card's weakness is: antialiasing. Antialiasing, to me, is a critical feature, because my monitor's native resolution is 1680 x 1050, which is high, but not high enough to make going without AA an option. It is a must-have feature for me, and I suspect it is a must-have feature for a lot of other gamers who are similarly situated.
Other reviews tend to run no AA or 4xAA and then compare frame rates. Some reviews test very high resolutions as well. After playing around with the card myself, I think [H]'s methodology is the best, and their conclusions are also pretty accurate.
At the end of the day, the only thing that matters is the maximum playable settings a card can provide in the games that you play. I have an image quality standard that I require of all games without exception: 1680 x 1050 with 4xAA and 16xAF, 50 fps average. This card doesn't cut it, not for Oblivion at least.
The HD 2900 XT has a lot going for it, but for some reason I cannot explain, ATi gimped this card with respect to antialiasing. If I understand the hardware correctly, the 2900 has 16 ROPs, the GTS has 20, and the GTX has 24. My understanding of the architecture of these cards is limited, but as I understand it, the ROPs are the part of the card with the biggest impact on AA performance. There is just no way around the fact that the card is lacking in this department, and it shows in games.
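For what it's worth, the per-clock ROP numbers alone don't tell the whole story, because the cards run at different core clocks. Here is a rough back-of-the-envelope sketch; the ROP and per-clock Z figures are the ones quoted in [H]'s review, but the core clocks are my own assumption of the approximate reference clocks, so take those with a grain of salt:

```python
# Rough theoretical Z fill rate comparison.
# ROP and Z-per-clock figures are from [H]'s review; the core clocks
# are my assumption (approximate reference clocks), not from the review.

CARDS = {
    # name:        (rops, z_per_clock, core_mhz [assumed])
    "HD 2900 XT": (16, 32, 742),
    "8800 GTS":   (20, 40, 500),
    "8800 GTX":   (24, 48, 575),
}

for name, (rops, z_per_clock, core_mhz) in CARDS.items():
    # billions of Z samples per second = per-clock rate * clock speed
    gz_per_sec = z_per_clock * core_mhz / 1000
    print(f"{name}: {rops} ROPs, ~{gz_per_sec:.1f} GZ/s")
```

If those assumed clocks are roughly right, the 2900 XT's higher clock partially makes up for its fewer ROPs, and its theoretical Z rate lands between the GTS and GTX. That does not match the much larger AA gap I am seeing in games, so raw ROP throughput alone may not be the whole explanation.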
It also has fewer texture units. As [H] explained:

"The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison, the GeForce 8800 GTX has twice as many texture units, 32, and does 32 FP16 pixels per clock, and the GTS has 50% more, with 24 FP16 pixels per clock. It seems that ATI is focusing more on shader processing like they did with the Radeon X1K architecture. The GeForce 8800 GTS and GTX seem to have much higher texture filtering performance available."

Further:

"There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48."

I have come to the conclusion, and I could be wrong, that this is the reason the GTS and GTX perform so much better in Oblivion. I have not used either of the Nvidia cards personally, so I can't really comment on them. I find it hard to believe that any driver update is going to rectify this kind of poor performance. I think future drivers will improve the card's performance, but unless someone can explain to me how a driver update can close the gap in ROPs and AA performance, this card is going back to Newegg.
The point I'm trying to make is that, as an ordinary enthusiast/gamer, I think [H]'s analysis of this card is correct. Some of the other reviews, and especially the 3DMark benchmarks, are very misleading, because the fact of the matter is that once you enable AA this card's performance takes a giant nosedive.
When I think about how much foliage will be in Crysis (not to mention the inevitable next installment of the Battlefield series), this card just seems like too much of a gamble. I am also very concerned that with Crysis still a good six months away (I think), a new ATi card with more ROPs and texture units will appear before then.
I don't doubt that this card does some things better than the GTS (geometry?). It is also possible this card will perform better in DX10 games. But I simply do not see how this card will ever close the gap with the GTS with respect to antialiasing. I have never made an upgrade that yielded this little in terms of actual image quality.
For these reasons I will be returning the HD 2900 XT. What I replace it with is still up in the air; probably a GTX.
Disclaimer: I am just an end user. I am not involved in the computer industry in any professional capacity.