My experience with HD2900XT @ 1680 x 1050 w/ 4xaa

Blackstone

Update/Disclaimer: I'm just a normal end user and these are just some subjective observations about the 2900XT after playing with it. The purpose of this thread is to collect data/opinions on the card and to speculate how this card will perform in future games.

This morning I installed what I believe is Catalyst driver version 8.38 RC7 and took a walk through the woods in Oblivion @ 1680 x 1050, 4xaa and 16af forced at the driver level, with HDR enabled in the game. I have been testing informally and subjectively from the same save point in the game with different drivers over the past few days.

Is there an improvement from one driver to the next? The answer is no. This card simply chokes on scenes with a lot of trees at the above settings, and the frame rate drops into "this is laggy" or "what a bummer" territory pretty consistently. This is the third driver I have tried, and if there is an improvement I can't see it; at the very least it isn't meaningful. The issue is clearly the trees, specifically the tops of the trees with all the leaves. That is without a doubt what the card is struggling with at 4xaa. It is very disappointing.

Note that the card performs better in the Shivering Isles expansion pack because oversized mushrooms replace a lot of the trees, at least in the areas I have played through. Mushrooms do not have lots of leaves to render, obviously, so that makes sense.

The same goes for Battlefield 2: you would expect the card to handle such an old game with ease, but 8x antialiasing is also laggy when there are lots of trees with lots of leaves. On most maps, 8xaa is not really an option. Since that is the only in-game option higher than 4x available to me (and forcing AA at the driver level doesn't work for me), this card does not yield any gameplay advantage over my old X1800XT except for better framerates. There is no image quality improvement.

In both games if you pan down and just look at the grass the frame rate jumps up, but when you pan back up so that lots of trees are in the frame the frame rate slows to a crawl. If you turn the grass distance all the way down in Oblivion, you still get laggy frame rates because of those trees with 4xaa.

At first I couldn't figure out why some reviews of this card were so positive while [H]'s review was so harsh. I think I get it now. [H]'s review, by stressing maximum playable settings as opposed to frame rates and resolutions, highlights what this card's weakness is—antialiasing. Antialiasing, to me, is a critical feature, because my monitor’s native resolution is 1680 x 1050, which is high, but not high enough to make going without AA an option. It is a must have feature for me, and I suspect it is a must have feature for a lot of other gamers who are similarly situated.

Other reviews tend to do no aa or 4xaa and then compare frame rates. Some reviews do some super high resolutions as well. After playing around with the card myself, I think [H]’s methodology is the best and their conclusions are also pretty accurate.

At the end of the day, the only thing that matters is the maximum playable settings that a card can provide in the games that you play. I have an image quality standard that I require of all games without exception—1680 x 1050 with 4xaa and 16xaf, 50 fps average. This card doesn’t cut it—not for Oblivion at least.

The HD2900XT has a lot going for it but for some reason I cannot explain, ATi gimped this card with respect to antialiasing. If I understand the hardware correctly, the 2900 has 16 ROPs, the GTS has 20 ROPs, and the GTX has 24. My understanding of the architecture of these cards is limited, but my understanding is that the ROPs are the part of the card that has the biggest impact on AA performance. There is just no way around the fact that the card is lacking in this department and it shows in games.

It also has fewer texture units. As [H] explained:
The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison the GeForce 8800 GTX has twice as many texture units, 32 and does 32 FP16 pixels per clock, and the GTS has 50% more with 24 FP16 pixels per clock. It seems that ATI is focusing more on shader processing like they did with the Radeon X1K architecture. The GeForce 8800 GTS and GTX seem to have much higher texture filtering performance available.
Further:
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48.
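
To put those per-clock figures in rough perspective, here is a quick back-of-the-envelope calculation in Python. The per-clock numbers come straight from the [H] quotes above; the core clocks are my own assumptions (roughly 740 MHz for the 2900 XT, 500 MHz for the GTS and 575 MHz for the GTX), so treat the results as ballpark only:

# Rough per-second throughput from the per-clock figures quoted above.
# The core clocks (MHz) are assumptions, not from the [H] article.
cards = {
    #             FP16 texels/clk, Z samples/clk, ROPs, assumed core MHz
    "HD 2900 XT": (16, 32, 16, 740),
    "8800 GTS":   (24, 40, 20, 500),
    "8800 GTX":   (32, 48, 24, 575),
}
for name, (texels, zsamples, rops, mhz) in cards.items():
    clk = mhz * 1e6
    print(f"{name}: ~{texels * clk / 1e9:.0f} Gtexels/s FP16 filtered, "
          f"~{zsamples * clk / 1e9:.0f} Gsamples/s Z, {rops} ROPs")

The higher core clock narrows the per-clock gap on paper, at least against the GTS; the GTX is clearly ahead either way.
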
I could be wrong, but I have come to the conclusion that this is the reason why the GTS and GTX perform so much better in Oblivion. I have not used either of the Nvidia cards personally so I can't really comment on them. I find it hard to believe that any driver update is going to rectify this kind of poor performance. I think future drivers will improve the card's performance, but unless someone can explain to me how a driver update can close the gap with respect to ROPs and AA performance, this card is going back to Newegg.

The point I'm trying to make is that as an ordinary enthusiast/gamer, I think [H]'s analysis of this card is correct. Some of the other reviews, and especially the 3DMark benchmarks, are very misleading, because the fact of the matter is that once you enable AA this card's performance takes a giant nose dive.

When I think about how much foliage will be in Crysis (not to mention the inevitable next installment of the Battlefield series) this card just seems like too much of a gamble. I am also very concerned that with Crysis still a good 6 months away (I think) a new ATi card will appear with more ROPs and texture units.

I don't doubt that this card does some things better than the GTS (geometry?). It is also possible this card will perform better with DX10 games. But I simply do not see how this card will ever close the gap with the GTS with respect to antialiasing. I have never made an upgrade that yielded this little in terms of actual image quality.

For these reasons I will be returning the HD2900XT. What I replace it with is still up in the air—probably a GTX.

Disclaimer: I am just an end user. I am not involved in the computer industry in any professional capacity.
 
Excellent real-world summary! Thanks for the compliments. From testing it seems this video card currently has a problem with performance taking a dive with AA enabled, which is the opposite of what you would expect considering it has a 512-bit memory bus and a high level of memory bandwidth available.

You bring up a good point about Crysis and the heavy use of vegetation in that game; it certainly will be interesting to see how the cards compare there.
 
From what I read around review sites, the problem with this card is it has "programmable" AA, whatever that means, which probably means they really have to tweak drivers to get full performance out of it, or else how do you explain that the card has almost double the bandwidth of the X1950XTX and loses a lot more performance? Makes NO sense! I wouldn't be so eager to sell the HD2900XT since I still believe a lot of people are gonna be surprised, but it's your choice anyway
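
For reference, the bandwidth comparison is just bus width times effective memory clock; a rough calculation (the memory clocks below are the stock figures as I recall them, so treat them as approximate):

# bandwidth (GB/s) = (bus width in bits / 8) * effective memory rate (MT/s) / 1000
def bandwidth_gbs(bus_bits, effective_mts):
    return bus_bits / 8 * effective_mts / 1000

# Effective memory rates are assumed stock clocks, from memory.
print("HD 2900 XT:", bandwidth_gbs(512, 1650), "GB/s")   # ~105 GB/s
print("X1950 XTX :", bandwidth_gbs(256, 2000), "GB/s")   # ~64 GB/s

So it's more like 65% more bandwidth than double, but either way it doesn't look like raw bandwidth is what's holding the card back with AA.
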
 
Well, the thing is, if I knew why it takes such a big performance hit with aa, I would be in a better position. Is it because of the lack of texture units or ROPs? The ROPs seem to be the prime suspect. But there is also a difference in the amount of video memory to consider as well.

I wish somebody with some technical expertise would try to get ATi to explain why they went with fewer ROPs and texture units. Was this a cost-cutting measure, or does this reflect forward-looking thinking that will pay off with future games? Or could they not successfully implement more units?

I would rather hold on to this card because I have a crossfire motherboard, but I just find it hard to believe this is a driver problem and is going to be resolved somehow.

Also, with respect to the narrow and wide tent modes, I tried all of these modes in Oblivion and I still think the standard 4x box setting gives the best image quality results. With the narrow tent setting it claims you are getting 4x samples and the framerates are much better than 4x box, but it just doesn't look as good.

By the way, I really don't want to return this card. I have had an ATi card since the 9700 pro. (9700pro-9800pro-X800xl-x1800xt) It really overclocks well and the build quality is good. I also really like the cooler. I don't think I can keep it unless the aa issues are resolved.

Is it even plausible that a driver could resolve this issue? I don't see how it can catch up to the GTS with AA, having fewer ROPs.
 
AWESOME REVIEW!!!!!

Thanks for that information. I am kind of sad to see the new ATI card perform like that. :confused:

I used to own a Radeon 9800 Pro 256 back in the day and I loved that card. At the time, every game that was out had no problems whatsoever. I used to play RTCW all the time and I would always push 75+ fps. It would just crunch through 3D (OpenGL and DX) at any moment with max settings.

Oh well, Nvidia is the ruler of the roost now. I will go SLI with my 8800 soon enough, but one card is still good for me. Bring on UT2007 with all of its DX10 goodness and PhysX. I really still wanna see that work well...my card is sitting on a shelf. :eek:
 
On the 8800 GTX, I'm basically running around with maxed settings in everything right now...and it sounds like my old 7800 GTX did better than you're describing. I was really hoping the graphics fight would heat up more, but it's not looking that way. :(

I still won't count ATI out of the (more financially significant) mid-range fight, but I won't be holding my breath over a driver-based miracle on the 2900XT.
 
I'm going to hold onto the card for a little while longer to see if there is an official driver release. If there is I'll see if there is any change.
 
@ OP. Thanks for sharing your experience. Even if you're not showing exact frame numbers and screen shots, your summary tells me much more than all the threads showing 3dmark scores.
 
Well, the thing is, if I knew why it takes such a big performance hit with aa, I would be in a better position. Is it because of the lack of texture units or ROPs? The ROPs seem to be the prime suspect. But there is also a difference in the amount of video memory to consider as well.

ROPs

I wish somebody with some technical expertise would try to get ATi to explain why they went with fewer ROPs and texture units. Was this a cost-cutting measure, or does this reflect forward-looking thinking that will pay off with future games? Or could they not successfully implement more units?

They are taking the same stance they did with the X1K series, putting more focus on shader performance and stream computing because they feel this will be more important for performance in the future.

However, until games do away with textures, texture performance is still very important: what good are shaders that make textures look better if you can't fetch the texture fast enough in the first place? Filtering performance is another issue when you talk about the TMUs.

Also, with respect to the narrow and wide tent modes, I tried all of these modes in Oblivion and I still think the standard 4x box setting gives the best image quality results. With the narrow tent setting it claims you are getting 4x samples and the framerates are much better than 4x box, but it just doesn't look as good.

Yep, as we noted enabling the tent modes blurs textures.
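
A rough way to see why the tent modes soften textures: a box resolve averages only the samples inside the pixel, while the tent modes also pull in samples from the neighbouring pixels with distance-based weights, which acts like a mild blur over the whole frame. A toy 1-D sketch in Python (the sample positions and the tent kernel here are illustrative, not ATI's actual filter):

# Toy 1-D resolve: a one-pixel-wide bright detail (think a thin, high-contrast
# texture feature) sampled at 4 positions per pixel.
# Box resolve: average only the samples inside each pixel.
# Tent resolve: also weight samples from the two neighbouring pixels, with
# weights falling off linearly with distance (illustrative kernel only).

samples_per_pixel = 4

def scene(x):                     # bright detail covering pixel 3 only
    return 1.0 if 2.5 <= x < 3.5 else 0.0

def sample_positions(px):         # evenly spread sub-pixel sample positions
    return [px - 0.5 + (i + 0.5) / samples_per_pixel
            for i in range(samples_per_pixel)]

def box_resolve(px):
    s = [scene(x) for x in sample_positions(px)]
    return sum(s) / len(s)

def tent_resolve(px):
    total, wsum = 0.0, 0.0
    for p in (px - 1, px, px + 1):              # include neighbouring pixels
        for x in sample_positions(p):
            w = max(0.0, 1.0 - abs(x - px))     # tent weight, radius 1 pixel
            total += w * scene(x)
            wsum += w
    return total / wsum

for px in range(1, 6):
    print(px, "box:", round(box_resolve(px), 3), "tent:", round(tent_resolve(px), 3))
# box keeps the detail in one pixel; tent smears it into the neighbours (blur).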

Is it even plausible that a driver could resolve this issue? I don't see how it can catch up to the GTS with AA, having fewer ROPs.

The architectures are so different it is hard to tell which will be better for the future; ATI says theirs, NV says theirs. We just won't know until we try some of those "next gen" games.

Drivers can help. What will be important with the 2900 XT is getting that thread dispatcher working at peak efficiency, properly scheduling vertex and pixel data to the stream processors. It will be hard for them IMO to get this superscalar architecture working at full efficiency. NV couldn't do it with the NV30; we all know how that went.
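
To give a feel for what's at stake with that scheduling: as I understand it, the R600 exposes its 320 stream processors as 64 units that each issue up to 5 instructions per clock, so the effective ALU count depends on how many of those 5 slots the compiler and dispatcher can keep filled. A rough sketch (the 740 MHz clock and the 5-wide layout are my assumptions about the hardware, and 2 flops per ALU per clock is just a ballpark for a multiply-add):

# Effective ALU throughput vs. how many of the 5 issue slots get filled.
units, width, core_mhz = 64, 5, 740          # assumed R600 layout and clock

for filled in (5, 4, 3, 2):
    effective_alus = units * filled
    gflops = effective_alus * 2 * core_mhz / 1000   # 2 flops/ALU/clock (MADD)
    print(f"{filled}/{width} slots filled: {effective_alus} effective ALUs, ~{gflops:.0f} GFLOPS")

That's the sense in which driver and compiler work can genuinely move the needle on this part, independent of the ROP question.
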
 
Sorry to hear of your experience. AA is extremely important to me too, and I play at the same resolution as you.

ATI's AA modes feel like they are going to go the way of nVidia's Quincunx, i.e. abandoned.
 
I have an image quality standard that I require of all games without exception—1680 x 1050 with 4xaa and 16xaf, 50 fps average. This card doesn’t cut it—not for Oblivion at least.

With those standards, I'm not sure why you gambled on an r600 in the first place. I would have recommended a GTX without hesitation based on the figures you are looking to obtain. In modern graphics engines you will be pushing the limits of rendering capacity.
 
Considering that next-gen DX10 slowed all these cards down to a crawl, I wouldn't get any currently, unless UT2k7 magically plays very well on either hardware.

Yes it would be interesting to test these "next-gen" cards with next-gen games, but we don't have the latter. What good are next-gen cards if you don't have a game to play?

For right now, I think NV has the edge. Hopefully when the refresh parts or next-next-gen cards come out, the hardware will be a worthwhile purchase. The X1950XTX is still holding strong in all the latest games I have tried (Oblivion sucks).

Shame though, the HD2900XT should have been a great card...
 
Well we don't have any GOOD DX10 games to test yet, and gameplay videos of Crysis look smooth in DX10. Console ports usually run like crap vs true PC games.
 
That is true that console ports do suck for the PC (COD2?). Thing is, even in the Crysis gameplay footage there was noticeable slowdown in different areas. Not sure what hardware they ran it on, but surely not a single card that anyone possesses today.
 
I think they are using a Quad and a GTX (maybe SLI?). Not sure what resolution they are running at either. Does someone have info from a gaming convention or something? (For a new thread, of course.)
 
It had to be SLI, did you see how good those demo videos looked?

Edit: I think I'm going with the GTX. It isn't that much more expensive. It will be my first Nvidia card since...I dunno the TNT2 Ultra or whatever came after that.

By the way, does anyone know how much performance the GTX gained from its first driver release to the drivers that are out now? Are they good about updating their drivers frequently?
 
My Oblivion results with the 8.38 RC7 drivers (didn't run these with 8.37, so don't ask)


ALL taken at 1680x1050 with all settings at MAX. TA turned off in the AA settings, Box AA chosen
___________________________________________________________________
Oblivion Crossfire 16AA 16AF HDR ON
Avg: 11.548 - Min: 7 - Max: 41

Oblivion Crossfire 8AA 16AF HDR ON
Avg: 21.019 - Min: 11 - Max: 33

Oblivion Crossfire 4AA 16AF HDR ON
Avg: 39.429 - Min: 20 - Max: 62

Oblivion Crossfire 2AA 16AF HDR ON
Avg: 49.261 - Min: 30 - Max: 62

Oblivion Crossfire No AA 16AF HDR ON
Avg: 51.478 - Min: 30 - Max: 62
--------------------------------------------------------------------


Oblivion Crossfire 16AA 16 AF HDR OFF Bloom ON
Avg: 36.817 - Min: 24 - Max: 48

Oblivion Crossfire 8AA 16AF HDR OFF Bloom ON
Avg: 53.687 - Min: 31 - Max: 62

Oblivion Crossfire 4AA 16AF HDR OFF Bloom ON
Avg: 52.532 - Min: 29 - Max: 61

Oblivion Crossfire 2AA 16AF HDR OFF Bloom ON
Avg: 53.066 - Min: 33 - Max: 62


Crossfire Turned OFF
______________________________________________________________________

Oblivion 2AA 16AF HDR OFF Bloom ON
Avg: 39.642 - Min: 24 - Max: 54

Oblivion 4AA 16AF HDR OFF Bloom ON
Avg: 39.523 - Min: 26 - Max: 56

Oblivion 8AA 16AF HDR OFF Bloom ON
Avg: 37.921 - Min: 24 - Max: 50

Oblivion No AA 16AF HDR Off Bloom ON
Avg: 40.658 - Min: 26 - Max: 57
-----------------------------------------------------------------------

Oblivion 8AA 16AF HDR ON
Avg: 11.138 - Min: 5 - Max: 25

Oblivion 4AA 16AF HDR ON
Avg: 23.054 - Min: 14 - Max: 36

Oblivion 2AA 16AF HDR ON
Avg: 32.372 - Min: 17 - Max: 49

Oblivion No AA 16AF HDR ON
Avg: 39.896 - Min: 19 - Max: 61
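
For what it's worth, the single-card HDR-on averages above work out to roughly the following AA hit (quick calculation):

# Percentage drop vs. no AA, using the single-card HDR-on averages posted above.
hdr_on = {"No AA": 39.896, "2AA": 32.372, "4AA": 23.054, "8AA": 11.138}
baseline = hdr_on["No AA"]
for mode, avg in hdr_on.items():
    print(f"{mode}: {avg:.1f} fps ({(1 - avg / baseline):.0%} below the no-AA average)")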
 
I just found this explanation as to why the R600 was designed the way it was and why the aa performance is so lackluster right now:

As most gamers will want AA and AF enabled in games, the HD 2900XT's poor performance with these processing options enabled is a serious problem for the card and ATi. We asked ATi to comment on this surprising result and the company revealed that the HD 2000-series architecture has been optimised for what it calls 'shader-based AA'. Some games, including S.T.A.L.K.E.R., already use shader-based AA, although in our tests the 640MB 8800 GTS proved to be faster than the HD 2900XT.

We asked Richard Huddy, Worldwide Developer Relations Manager of AMD's Graphics Products Group, to go into more detail about why the Radeon HD 2000-series architecture has been optimised for shader-based AA rather than traditional multi-sample AA. He told us that 'with the most recent generations of games we've seen an emphasis on shader complexity (mostly more maths) with less of the overall processing time spent on the final part of the rendering process which is "the AA resolve". The resolve still needs to happen, but it's becoming a smaller and smaller part of the overall load. Add to that the fact that HDR rendering requires a non-linear AA resolve and you can see that the old fashioned linear AA resolve hardware is becoming less and less significant.' Huddy also explained that traditional AA 'doesn't work correctly [in games with] HDR because pixel brightness is non-linear in HDR rendering.'

While many reviews of the HD 2900XT have made unflattering comparisons between it and Nvidia's GeForce 8800-series, Huddy was upbeat about AMD's new chip. 'Even at high resolutions, geometry aliasing is a growing problem that can only really be addressed by shader-based anti-aliasing. You'll see that there is a trend of reducing importance for the standard linear AA resolve operation, and growing importance for custom resolves and shader-based AA. For all these reasons we've focused our hardware efforts on shader horsepower rather than the older fixed-function operations. That's why we have so much more pure floating point horsepower in the HD 2900XT GPU than NVIDIA has in its 8800 cards... There's more value in a future-proof design such as ours because it focuses on problems of increasing importance, rather than on problems of diminishing importance."

http://www.custompc.co.uk/custompc/...ns-radeon-hd-2900xts-poor-aa-performance.html
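
To make the "non-linear resolve" point a little more concrete, here's a toy sketch; the Reinhard-style tone mapper and the sample values are mine and purely illustrative, not anything ATi has published. Averaging HDR samples first and then tone mapping gives a different result at a bright/dark edge than tone mapping each sample and then averaging, which is roughly why a fixed-function linear resolve can look wrong on HDR content and why doing the resolve in the shader gives more freedom:

# Toy example: resolving 4 MSAA samples of an edge pixel that is half bright
# HDR sky and half shade, with a simple Reinhard-style tone map (illustrative).
def tonemap(c):
    return c / (1.0 + c)

samples = [8.0, 8.0, 0.1, 0.1]                    # raw HDR sample values

resolve_then_tonemap = tonemap(sum(samples) / len(samples))
tonemap_then_resolve = sum(tonemap(s) for s in samples) / len(samples)

print("linear resolve, then tone map :", round(resolve_then_tonemap, 2))  # ~0.80, edge skews bright
print("tone map, then resolve        :", round(tonemap_then_resolve, 2))  # ~0.49, smoother gradient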

I think you could read this as an argument that Oblivion performance is not going to be a good predictor of Crysis/UT3/Bioshock performance. Are they basically arguing that with future games the 2900 series cards will be able to get rid of jaggies, just in a different way? How are jaggies a problem of decreasing importance? I'm not saying I don't buy it; I just don't get it. I don't get why the benefit isn't realized in Oblivion.

I would like to hang onto this card I'm just concerned that the 8800 series is handling the aliasing problem so much better in today's games. Very confusing. I think I need to find a less complicated hobby.
 
A very intelligent poster added this to my parallel thread on Tom's. Responding to ATi's explanation, "The Great Ape" wrote:

That's what I was talking about, yet that shouldn't have kept ATi/AMD from putting in hardware resolve as well, since you can do both and the transistor cost is minimal. Some reviewers aren't buying that they didn't do both, and think they just have a broken back end. I'm not convinced, because to me it just looks like an extension of ATi's miscalculation of how quickly these requirements would be exploited.

If you think about what they're looking towards and consider the R600 as a design that's supposed to sit at the beginning of DX10 games, not in the pre-DX10 era, this shader-based AA feature that's a requirement for D3D10.1 is a good approach, but right now it's just not cutting it compared to the dedicated hardware in the X1900 and GF80.

...I believe that ATi was so myopic as to just do the shader-based resolve and not have a backup plan of legacy support for performance.

...you can't apply AA correctly in certain situations (it breaks or reduces the effects); they point out HDR as an example, and it makes sense, although I doubt people notice the difference when they've been doing it over the past year. A few other issues are geometry shaders, soft shadows, and displacement maps, all of which pose AA problems because of the way the effect is handled in the shader and how applying AA outside the shader would essentially distort the effect.

Supposedly it will be available if the game has the option for it, and also offered as a forced feature for some titles (similar to how the Chuck patch was implemented), but it's still too early to tell. Really it shouldn't be difficult for the devs to implement, since it doesn't require much work on their part; the R600 should handle the work, it just needs the cue to know it can do it.

It seems inefficient now, and it is obviously the way to go in the future, but it's still not practical yet, which sucks, especially if we think it will be useful in the near future. D3D10.1 requires this shader-based AA option, but I still think ATi should've designed for both for now, then dropped hardware resolve later. They may have, and it's borked, but it just sounds like they took another leap of faith like they did with the X1900 shader imbalance, and this time they missed on that aspect.


Can the 8800 series do all this programmable shader stuff as well?

By the way I'm just doing this because I'm sure other people are as confused about all this stuff as I am. Anyone have any good links to "how graphics cards work" guides/FAQs?
 
Blackstone, your research has also helped me understand why the HD2900XT is sub par in the AA department.

This reasoning lets me know that the new card still has something it can excel at; however, it seems they optimized this card for something that we do not need yet. Pretty much, they aimed too far above the bar.

Either way, I like the card. The GTX is obviously a better gaming experience right now and not for much more money, same with the GTS 640, but it's good to know the card still has qualities people can purchase it for.
 
Well, Huddy's claims just do not cut it. It's all well and good to point out that the current method of fixed hardware AA resolve won't produce optimal results in certain circumstances, such as HDR or AA being applied to shader effects, but if the R600 is really targeting next-gen games and their supposedly higher arithmetic demands (hence the R600's overblown ALU muscle), then why the hell is it struggling with doing AA resolve in its shader pipes now? Do they think that DX10 games will demand less ALU utilization? It doesn't make any sense to me.

The best speculation was put forward in the B3D R600 review, where they suggested that the ROP AA resolve unit is just "broken" and thus ATI was left with no choice but to do the resolve on the shaders.
 
I am not sure why your BF2 results are different from mine, but first I have to ask you (the OP) to post your specs. It's common knowledge that BF2 uses a lot of RAM; if you want to turn up the graphics you had better have at least 2 gigs. Also, there is a console command, "game.lockfps 999", that unlocks frame rates above 100 FPS. Not that it would matter in your case, but it's always good to do when making a comparison like this.

Now that that's out of the way, I want to point out a flaw in BF2 that is also common knowledge (more so in the widescreen community). At any resolution above 1280x1024 the view gets stretched horizontally and vertically, which changes the field of view. With less field of view, I cannot understand why you are getting decreased frame rates. Look at the screens below by placing each one in its own tab, then tab back and forth between the two. Look at the red car on the far left to see how the field of view changes.

This image is at 1600x1200


This image is at 1280x1024

As you can see, there is more field of view at 1280x1024 than at 1600x1200. And if you look at the upper left corner, the frame rates are different only because the options between the two are different (which are clearly shown in the photos). I show these pics to reflect field of view more than eye candy. Maxing eye candy does not change (or decrease) the field of view. However, in the menu you can change view distance, which is not the same as field of view (both remained at 100%).
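
For reference, these resolutions aren't even the same shape, so switching between them has to change either how much of the scene you see or how stretched it looks. This is just the arithmetic, not a claim about how BF2's engine decides to handle it:

# Aspect ratios of the resolutions in question; moving between them means the
# game must either change the field of view or stretch the image.
for w, h in ((1280, 1024), (1600, 1200), (1680, 1050)):
    print(f"{w}x{h}: aspect ratio {w / h:.2f}")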

Now for the setup:

Setup is shown here. However, I made sure that AF was set at 16x


Karkand

Looking at the UAV in motion can decrease frame rates. Here I am at 64 FPS


Looking at a soldier here at 80 FPS. This photo shows how high the frame rates can go.


In the grass at spawn point, 58 FPS


Wake Island

This photo shows 60 FPS in the midst of other players. Trees and grass...


This photo shows 70 FPS with a chopper trying to thread the needle down the dirt path. Trees, grass and foliage.


This photo is more about the trees and foliage, at 67 FPS.

In order to see the full size of these pics (1600x1200) you have to click on the thumbnails found in this post. Once the image shows up on your screen, click it again. For those using monitors less than 22" you will notice parts of the image missing from your screen.

Now, frame rates do fluctuate, and I tried my best to get a worst-case scenario to see if I could duplicate the OP's frame rates. In the end I was not able to. I experienced no slowdowns or hiccups that would hinder me from playing the game. I see that you edited your post, but I do recall you posting FPS in the teens, which is why I decided to test this myself.

The purpose of this post is to show my results in BF2. There is always room for improvement; however, I do not see anything in BF2 that warrants concern. I have examined frame rates and IQ on other maps and found nothing out of the ordinary with 8x AA and 16x AF. However, BF2 does not have a jungle-type map. The best map I could think of that offers some sort of "jungle" environment is Wake Island. Other maps may have a few more or fewer trees here and there; however, I have not experienced the frame rates that were posted in the OP.
HD 2900XT
E6700
2 Gigs of Ram
 
My specs are in my signature but here they are again:

Silverstone TJ07 | PC Power & Cooling Silencer 750 Quad| E6600@ 3.4 Ghz | Tuniq Tower 120 | 2GB OCZ DDR2 1000 Cas3| ATI Radeon HD2900XT | NEC 20.1 Widescreen | X-Fi | Sennheiser HD590| B&W Nautilus 804 | Threshold T2 | Bryston 3B-ST|

By the way, what are YOUR specs?

Also, those are ugly screens--I see lots of aliasing. Are you forcing any settings at the driver level, because if you are, you might not be getting any AA at all. You can't force AA in BF2 with Catalyst. I'm not sure about filtering.

Finally, go find a map with lots of trees, look at the trees, and do it again, because that is what I am talking about. Looking at a dead soldier on the ground isn't going to bring it out. I know all about the field of view thing, that has nothing to do with what I'm talking about. I'm talking about trees and foliage. Karkand is the easiest map to run in the whole game. Go to FuShe Pass and try it again.
 
GREAT topic. This right here could explain everything - in preparing for the future, ATI has neglected the now. Keep us posted Blackstone, this is good stuff!
 
The purpose of this post is to show my results in BF2. There is always room for improvement; however, I do not see anything in BF2 that warrants concern. I have examined frame rates and IQ on other maps and found nothing out of the ordinary with 8x AA and 16x AF. However, BF2 does not have a jungle-type map. The best map I could think of that offers some sort of "jungle" environment is Wake Island. Other maps may have a few more or fewer trees here and there; however, I have not experienced the frame rates that were posted in the OP.
HD 2900XT
E6700
2 Gigs of Ram

What resolution are you at on Wake Island? What driver are you using? I can tell you right now that I don't get those frame rates at 4x, let alone 8x, so something is wrong here. Do you have adaptive AA turned on?
 
The 8800gts 640mb did well at 4x TR MSAA & 16AF @ 1600x1200: http://www.hardocp.com/article.html?art=MTI4MSw0LCxoZW50aHVzaWFzdA==

I'm surprised the GTS got such a high score at that resolution. Nevertheless, that's still a lower framerate than the OP stated he was going for. To get that extra 10 frames would definitely require stepping up to a GTX.

The r600 won't average 50 fps at 4xAA/16AF 1600x1200 in the latest DX9 engines, and it will average even less in the first DX10 engines. Half a year from now, or one year, the card's relative power in the latest games will decrease, not increase. "Future proofing" is not a real concept. Every high-end video card is intended to be the most powerful solution on the market at the time of its release. If it is not, it is a failure. From most reviews, the r600 is roughly comparable to or inferior to a GTS. Since even a GTS won't give you the performance you're looking for, the r600 should be nowhere on your radar screen.

You have a shitload of options:
nVidia 8800 GTX from eVGA
nVidia 8800 GTX from BFG
nVidia 8800 GTX from XFX
etc...

The 8800 line by nVidia represents the single biggest leap in consumer hardware ever introduced. There has never been a wider generational gap -- not among graphics cards, CPUs, RAM, sound cards, disc drives, or any other computer components, period. It would not be a stretch to call the 8800 line the best consumer hardware components ever.

Get a GTX already and stop needlessly worrying about crap when you could be playing nearly every game fully maxed out.
 
I'm surprised the GTS got such a high score at that resolution. Nevertheless, that's still a lower framerate than the OP stated he was going for. To get that extra 10 frames would definitely require stepping up to a GTX.

1680 x 1050 with 4xaa and 16xaf, 50 fps average

Well the GTS from the review is running 1600x1200 which is quite a bit larger than 1680x1050, and is running higher quality AA as well. You could turn down some settings if the smaller res doesn't up the fps enough. Also there have been quite a few driver updates since that review.
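
For reference, the raw pixel counts:

# Quick pixel-count comparison of the two resolutions.
for w, h in ((1600, 1200), (1680, 1050)):
    print(f"{w}x{h}: {w * h:,} pixels")
print(f"1600x1200 pushes {(1600 * 1200) / (1680 * 1050) - 1:.1%} more pixels than 1680x1050")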
 
Driver updates don't do anything, they are basically tech voodoo (oooh something changed, let's see if it "works better" now -- it works exactly the same but the end user convinces himself it's better), 1680x1050 vs 1600x1200 is <10% diff, I have always considered them identical resolutions albeit in different formats (widescreen vs standard), and that benchmark is an anomaly or Oblivion doesn't measure up - should have tested in F.E.A.R. I have a GTS 640 and I don't even average 40 fps at 1280x1024 at 2xAA and 4xAF. 4xAA and 16xAF is basically max settings, nobody uses anything higher unless they are running a tech demo. 40-50 fps is widely considered to be the "optimal" playing range for modern engines, as a general rule it takes a top of the line system to average 45 in the most modern engines. You'll never get over 60, which is why many new games are capped at that anyways. So basically this guy's minimum performance requirements require the most powerful GPU on the market today. Therefore I am telling him to get a GTX already and stop wasting his time with other things. Pretty simple, no? Second-best won't cut it and the r600 isn't even second best, as that honor goes to the venerable GTS640. So why is the r600 even in this discussion? I cannot figure this out. Would anyone like to comment on the 6800 Ultra? It was a powerhouse of a card...in 2004.
 
I didn't even know 1620x1200 was a resolution you could pick for 3D games these days; I thought it was only 1600x1200. That's about 2MP, isn't it? CMIIW
 
Well, Huddy's claims just do not cut it. It's all well and good to point out that the current method of fixed hardware AA resolve won't produce optimal results in certain circumstances, such as HDR or AA being applied to shader effects, but if the R600 is really targeting next-gen games and their supposedly higher arithmetic demands (hence the R600's overblown ALU muscle), then why the hell is it struggling with doing AA resolve in its shader pipes now? Do they think that DX10 games will demand less ALU utilization? It doesn't make any sense to me.

Well yeah it makes no sense being forward thinking (assuming that's not all PR spin) if you can't keep up in today's games. After all when you spend money today you don't want to have to wait 6 months to a year to see whether it was money well spent. By that time there will be cheaper and faster hardware anyway. Same thing happened with R300's DX9 abilities, NV40's SM3.0 features and R580's abundance of shader power. They were all rendered obsolete by the time they were necessary. At least R300 kicked ass in everything at the time and didn't depend on promises to prove itself worthy.
 
Also, those are ugly screens--I see lots of aliasing. Are you forcing any settings at the driver level, because if you are, you might not be getting any AA at all. You can't force AA in BF2 with Catalyst. I'm not sure about filtering.

qft. Those screens show TERRIBLE IQ.
 
Well yeah it makes no sense being forward thinking (assuming that's not all PR spin) if you can't keep up in today's games. After all when you spend money today you don't want to have to wait 6 months to a year to see whether it was money well spent. By that time there will be cheaper and faster hardware anyway. Same thing happened with R300's DX9 abilities, NV40's SM3.0 features and R580's abundance of shader power. They were all rendered obsolete by the time they were necessary. At least R300 kicked ass in everything at the time and didn't depend on promises to prove itself worthy.

Well said, my thoughts exactly. ATI should never have released this card in its current state. It was better when it was just late. They should've taken the time to fix whatever bottleneck(s) it has: broken or non-existent AA hardware, sub-par texturing, or whatever. Are some sales in the short term worth this fiasco? In many cases it's getting beaten by a 1950 XTX! What were they thinking?

It's a sad time for ATI fans.
 
Driver updates don't do anything, they are basically tech voodoo (oooh something changed, let's see if it "works better" now -- it works exactly the same but the end user convinces himself it's better), 1680x1050 vs 1600x1200 is <10% diff, I have always considered them identical resolutions albeit in different formats (widescreen vs standard), and that benchmark is an anomaly or Oblivion doesn't measure up - should have tested in F.E.A.R. I have a GTS 640 and I don't even average 40 fps at 1280x1024 at 2xAA and 4xAF. 4xAA and 16xAF is basically max settings, nobody uses anything higher unless they are running a tech demo. 40-50 fps is widely considered to be the "optimal" playing range for modern engines, as a general rule it takes a top of the line system to average 45 in the most modern engines. You'll never get over 60, which is why many new games are capped at that anyways. So basically this guy's minimum performance requirements require the most powerful GPU on the market today. Therefore I am telling him to get a GTX already and stop wasting his time with other things. Pretty simple, no? Second-best won't cut it and the r600 isn't even second best, as that honor goes to the venerable GTS640. So why is the r600 even in this discussion? I cannot figure this out. Would anyone like to comment on the 6800 Ultra? It was a powerhouse of a card...in 2004.

Umm OK?

I said the 640mb 8800gts would provide the settings he was asking for and showed proof by the testing from [H]. Sure he could spend the extra $200 for a GTX, but if he wants those settings for the GAMES he talked about, the GTS will fill his need.

And saying that driver updates don't do anything.. umm what?? Sure you won't get a huge gain, but small tweaks here and there will increase your FPS. Also some slight OCing on the card and it will run even better.
 
I must say that this is an excellent thread. As for the question about upgrading to an 8800GTX or not, I would say go ahead and do it. The kind of games ATi is talking about are not even coming this year IMO. By the end of this year, we are hearing, the next nVidia card will be hitting 1 TFLOP, which would be approximately 2-3 times the speed of an 8800GTX (or so they say). Just grab the 8800GTX if you have the dough, enjoy the games till the end of the year, and upgrade at Christmas or something.

If there is one thing I have learned in all this time, it is that I always kept waiting for the best card to come my way and the next price drop, and so I played all the good games on crappy hardware. By the time I actually had the next powerful card I had already finished the game. It happened to me countless times, so much so that I have now set a bar of $350-400 for graphics every 6-8 months, and I enjoy games thoroughly while I can.

Best of luck with your research, but compared to an 8800GTX I doubt the HD 2900 XT will show its colors before the year is over, when new nVidia cards hit the market. You are already looking at a 6-month purchase; judging by nVidia's past experience, I am guessing that is the time ATi would take to get their "shader-based AA" working in drivers, for games that won't be using it at least this year. Hence the reason an HD 2900 XT is a poor choice however you look at it. If it does not have the hardware power, it does not have the hardware power for current games... period.

I would like to add the same disclaimer as OP: I don't know much about architectures but from what I understand the above scenario is quite likely.
 
I would return the thing today if Newegg would take it back. They won't give me a refund, only an exchange for an identical unit. I would have kept my X1800XT to begin with, but it is broken and artifacts, so I need a new card. I would just keep the 2900XT, but I think it is overpriced for what it does.

By the way, I don't mean to suggest that the HD2900XT is a bad card--it is a big step up from my X1800XT, but it isn't as big a step as I was hoping for, and it is a smaller step than I am used to when I upgrade. The thing is, the reviews for this card are really misleading because they downplay the AA performance hit issue. This card should really sell for more like $350...

As for waiting for games and playing games on crappy hardware, I totally understand 100%. That is what I am trying to avoid; I want to have power to spare for Crysis. I'm not sure if any single graphics card will really do the job before Crysis comes out...possibly Nvidia's next batch. Graphics cards always lag behind the games.
 
I was under the impression that Nvidia cards couldn't do HDR+AA; that was true of the 7xxx line, while my X1900 could do it.
 
Umm OK?

I said the 640mb 8800gts would provide the settings he was asking for and showed proof by the testing from [H]. Sure he could spend the extra $200 for a GTX, but if he wants those settings for the GAMES he talked about, the GTS will fill his need.

And saying that driver updates don't do anything.. umm what?? Sure you won't get a huge gain, but small tweaks here and there will increase your FPS. Also some slight OCing on the card and it will run even better.

I don't think that anyone in the history of computing has produced a benchmark showing a real world performance gain from a driver update. I've certainly never seen one, despite upgrading hundreds if not thousands of times. The day that a driver update transforms a GeForce 2 into a GeForce 4, performance-wise, is the day that I'll pay attention. That day might occur after the next ice age.

It's a written law of tech that "upgrades" shall either do nothing or make something worse. I have yet to find an exception.
 