Oblivion: 7900GTX 512 SC compared to X1900XTX, tested!

AstroCat

OK, we got the game, and we have two boxes here that are almost identical, except one has the 7900GTX 512 Superclock and the other has the X1900XTX.

Both systems have fresh, clean WinXP Pro SP2 installs.

System 1
AMD Athlon 64 4000+ San Diego (2.4GHz)
ASUS A8N-SLi Premium
2GB Corsair XMS TWINX2048-3200C2 (2-3-3-6 1T)
ATI Radeon X1900XTX (650/775(1550)) - Cat 6.3
Enermax Noisetaker EG701AX-VE SFMA(24P) 600W
Maxtor DiamondMax 10 6L300S0 300GB 7200rpm SATA150
Audigy 2 ZS - latest drivers and latest OpenAL driver

System2
AMD Athlon 64 4000+ San Diego (2.4GHz)
ASUS A8N32-SLi Deluxe
2GB Corsair XMS TWINX2048-3200C2 (2-3-3-6 1T)
EVGA 7900GTX512SC (84.20)
Seasonic s12-600W
Western Digital Caviar SE16 WD2500KS 250GB SATA 3.0Gb/s
Audigy 2 - latest drivers and latest OpenAL driver

Game Settings
Ultra High Settings on both computers
AA off
HDR on
1600x1200 and 1280x960

ATI:
Cat forced 8x HQ AF
Cat AI: Standard
Mipmap Detail: High Quality
Vsync On and Off

Nvidia:
HQ mode, all optimizations OFF
Application-controlled AF
Trilinear forced
Negative LOD bias clamp on
Vsync On and Off

First off, Self Shadows looks clunky on both and defaults to off in the game's Ultra High Quality mode.

The game runs faster on the ATI box; plus, the ATI can use 4x or 8x HQ AF, while the 7900 can't use any forced AF if it's going to keep the same FPS.

It looks good overall on both and is definitely playable on both like this. I can keep the FPS over 30 most of the time if I play at 1280x960. At 1600x1200 I drop into the 20s a little too often outdoors.

I think I will end up with 1280x960, HDR, no AA for the best performance.

I have no loyalties, I'm just calling it like I see it. I'm interested in how Wednesday's Nvidia drivers help the 7900.

So far I'm really digging the game. :)
 
I don't know if I'd say they're in similar computers. There are some differences between them: hard drive, PSU, sound card, and mobo.

But it's good to see some comparisons now.
 
What a surprise... not.

They say the 360 version is super nice too, at or near the PC's highest level of detail. Though I'm sure it runs at 30 FPS, sometimes dipping into the 20s.
 
I would run them both in the A8N32 rig and test them on the same platform. Run Driver Cleaner between video card installs just to keep the test as untainted as possible without completely reloading for each card. That would be more accurate.

Not that I don't believe the results; clearly there will be games that favor one card's architecture over the other.
 
Well, they are pretty darned close, and if anything the Nvidia system is a little better: faster HD and mobo.
Overall: same CPU, same memory, clean installs, etc.

I didn't mean this to be a formal test. I just wanted to share the results from some quick testing at my house. :)

The ATI runs from the mid 30s to the 80s indoors, and 20s to 50s outdoors.

Just trying to share the info. :)
 
I am surprised there have been no official benchmarks on any site yet. Is there an NDA up until tomorrow or something?

I am curious about:

how much of a performance increase dual core offers
whether 1 GB vs 2 GB RAM makes a difference
and of course 7900GT vs X1900XT vs 7900GTX vs 7900GT SLI

so I can actually figure out which card to buy.
 
Just to make sure I'm not misreading: the Nvidia is a single card, correct? (Trying to figure out how the box I'm building will run.)
 
Jared701 said:
Just to make sure I'm not misreading: the Nvidia is a single card, correct? (Trying to figure out how the box I'm building will run.)

Yes, one EVGA 7900GTX 512 SC and one ATI X1900XTX :)
 
I've been hearing a lot about HDR and AA not coexisting even for ATI cards. Can you confirm or deny this?

*Edit: Uh, I see it was turned on there with AA, but I'd just like an unmistakable confirmation on this if you would :p
 
Dan_D said:
Not that I don't believe the results; clearly there will be games that favor one card's architecture over the other.

I doubt we will see another major game where the 7900 is faster.

For that to happen, you'd need to be optimizing your game to access textures a crapload, or something... that's the only place where the 7900 is faster.
 
I read ATI is working on Oblivion drivers too.

Anyway, I hope Nvidia isn't cheating in any fashion. The gains they made in FEAR seem kind of odd.

I don't think the drivers will catch them up all the way.
 
Oooska said:
I've been hearing a lot about HDR and AA not coexisting even for ATI cards. Can you confirm or deny this?

There has been lots of talk on the Elder Scrolls forums about this subject. Oblivion uses FP-blending HDR, and the ATI X1K is capable of doing this and FSAA simultaneously. However, so far the devs don't allow ATI to run such a path.
Hopefully we'll see a patch soon.
 
No HDR and AA together.
You can have Bloom and AA together.

Running at 1600x1200 with HDR is very nice. Of course I wish I could throw 2x or 4x AA on top as well, but you know what, I barely notice any "jaggies" at 1600x1200, so for me the HDR is totally worth it.
 
AstroCat said:
No HDR and AA together.
You can have Bloom and AA together.

Running at 1600x1200 with HDR is very nice. Of course I wish I could throw 2x or 4x AA on top as well, but you know what, I barely notice any "jaggies" at 1600x1200, so for me the HDR is totally worth it.
So no HDR and AA even with ATI hardware?
 
Just a general question about the game. I'm not really an RPG type, but this game has been hyped so much that I guess I'll have to try it. Do you think it has general appeal? The only reason I'm considering it is that it's single-player. I really don't have time to live in a virtual world; I just want something I can play occasionally and enjoy.

I'd appreciate any thoughts.
 
AstroCat said:
Plus, the ATI can use 8x HQ AF while the 7900 can't use any forced AF.

What does this mean? What anisotropic filtering mode(s) can't the 7900GTX use, and why...?
 
pibrahim said:
What does this mean? What anisotropic filtering mode(s) can't the 7900GTX use, and why...?

i think you should have asked that question before you went with the nvidia hype and made a choice to purchase your 7900gtx's
 
bobrownik said:
i think you should have asked that question before you went with the nvidia hype and made a choice to purchase your 7900gtx's
That was cold. I think what the reviewer meant was that you can't enable AF in the driver panel, and that you have to use the max the game allows.
 
pibrahim said:
What does this mean? What anisotropic filtering mode(s) can't the 7900GTX use, and why...?

I'm assuming it's about NV cards being unable to do angle-independent AF (which is ATI's HQ AF, right?). This is a hardware-related issue, and I sure as hell hope that won't be the case for the next gen.

Anyway, nice quick comparison :D.
 
bobrownik said:
i think you should have asked that question before you went with the nvidia hype and made a choice to purchase your 7900gtx's

Don't be a jerk; of course you can force 16x AF with the 7900GTX, and the performance hit isn't that huge.
 
Thanks for the testing! The rigs are so close to each other that I don't think there will be even a 2-3 FPS difference because of the mobo, HDD, etc.

This will be a heated battle for sure; I mean, Oblivion is in the top 10 selling games chart even though the game is not out yet. I seriously hope that Nvidia won't sacrifice IQ to gain performance.
 
High Quality AF mode set by the drivers. Nvidia does not have a High Quality AF mode like ATI does.

When I set the AF with the 7900, the FPS hit was too much, but with the ATI card, even at High Quality 8x AF the FPS was OK.
One thing I did notice, though, was that going from 8x to 16x AF was too much of a hit for me on the ATI card.

The game pushes the cards, and right now I am still seeing a 10-15 FPS higher average on the ATI card over the Nvidia.

And there is NO AA+HDR on any card; you can't even make that selection.
 
What happens if you force AA in the driver (specifically Catalyst) and turn on HDR in-game?
 
AstroCat said:
High Quality AF mode set by the drivers. Nvidia does not have a High Quality AF mode like ATI does.

When I set the AF with the 7900, the FPS hit was too much, but with the ATI card, even at High Quality 8x AF the FPS was OK.
One thing I did notice, though, was that going from 8x to 16x AF was too much of a hit for me on the ATI card.

The game pushes the cards, and right now I am still seeing a 10-15 FPS higher average on the ATI card over the Nvidia.

And there is NO AA+HDR on any card; you can't even make that selection.
10-15 FPS? How are you making this comparison? Just by looking at it? Or are you using FRAPS?
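For what it's worth, a FRAPS benchmark run would settle this: it logs per-frame times, and min/avg FPS fall out of those. Here's a minimal C++ sketch, assuming a FRAPS-style frametimes CSV with a one-line header and then one "frame, cumulative milliseconds" row per frame (that layout is an assumption here, not a documented spec):

```cpp
// Minimal sketch: compute average FPS and the worst single frame from a
// FRAPS-style frametimes log (assumed format: header line, then rows of
// "frame number, cumulative milliseconds since the run started").
#include <algorithm>
#include <cstdio>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s frametimes.csv\n", argv[0]); return 1; }
    std::FILE* f = std::fopen(argv[1], "r");
    if (!f) { std::perror("fopen"); return 1; }

    char header[256];                              // skip the assumed header row
    if (!std::fgets(header, sizeof header, f)) { std::fclose(f); return 1; }

    std::vector<double> stamps;                    // cumulative ms per frame
    int frame; double ms;
    while (std::fscanf(f, " %d , %lf", &frame, &ms) == 2) stamps.push_back(ms);
    std::fclose(f);
    if (stamps.size() < 2) return 1;

    double worst = 0.0;                            // longest single frame, in ms
    for (size_t i = 1; i < stamps.size(); ++i)
        worst = std::max(worst, stamps[i] - stamps[i - 1]);

    double seconds = (stamps.back() - stamps.front()) / 1000.0;
    if (seconds <= 0.0 || worst <= 0.0) return 1;
    std::printf("avg %.1f fps, worst frame %.1f ms (%.1f fps)\n",
                (stamps.size() - 1) / seconds, worst, 1000.0 / worst);
    return 0;
}
```

Running both cards over the same sewer-exit walk and comparing those numbers would be a lot more convincing than eyeballing the counter.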
 
I guess it kind of makes sense that there is some delta between ATI/NV. Remember, the X1900 has a lot more shader units, and this game, from what I've heard, seems to be shader-limited...
 
bobrownik said:
i think you should have asked that question before you went with the nvidia hype and made a choice to purchase your 7900gtx's

I'm delighted with my purchase of 7900GTXs but thanks for your "advice".

For everyone else willing to offer more constructive comments: what's the difference between my enabling 16x AF in SLI and "HQ AF"? Is the SLI AF still inferior...?
 
JDAdams said:
What happens if you force AA in the driver (specifically Catalyst) and turn on HDR in-game?

The game developer must "allow" FSAA while you're in the HDR path; you can't always force it. As with Far Cry, SS2, etc., the game needs a patch to enable HDR+AA on ATI X1K cards.
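To make the "developer must allow it" point concrete: under Direct3D 9, the game has to create a multisampled FP16 render target for its HDR path, and it can ask the API up front whether the hardware supports that at all. A minimal sketch of that capability check (the D3D9 call is standard; what any given card and driver report is the open question):

```cpp
// Minimal sketch (needs the Direct3D 9 SDK headers and d3d9.lib): ask D3D9
// whether the adapter can multisample an FP16 render target, which is the
// capability HDR+AA hinges on -- and which the game still has to request.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD quality = 0;
    // A16B16G16R16F is the blendable FP16 format HDR paths of this era render into.
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // the HDR render-target format
        FALSE,                     // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,  // 4x MSAA
        &quality);

    std::printf("4x MSAA on an FP16 target: %s\n",
                SUCCEEDED(hr) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}
```

If that check fails, forcing AA in the control panel can't help, because there is no multisampled HDR surface for the driver to resolve; and even where it succeeds, the game still has to create its HDR target that way.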

For everyone else offering to offer more constructive comments - what's the difference between my enabling SLI 16x AF and 'HQ AF'? Is the SLI AF still inferior...?

You mean the difference between NV 16x AF and ATI's HQ AF? The latter is angle-independent, which means the whole image gets filtered. Some surface angles don't get proper filtering NV's way.
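For the curious, here is roughly where the per-pixel anisotropy degree comes from, following the public EXT_texture_filter_anisotropic formula. The angle-dependent branch below is only a toy stand-in for the hardware shortcut, since the real heuristic isn't documented:

```cpp
// How a per-pixel anisotropy degree is derived, roughly following the
// EXT_texture_filter_anisotropic spec. The angleOptimized branch is a toy
// stand-in for the angle-dependent shortcut; the real hardware logic is
// not public.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Screen-space derivatives of the texture coordinates at one pixel.
struct TexDerivs { float dudx, dvdx, dudy, dvdy; };

int AnisotropyDegree(const TexDerivs& d, int maxAniso, bool angleOptimized) {
    // Lengths of the pixel's texture footprint along screen x and screen y.
    float px = std::sqrt(d.dudx * d.dudx + d.dvdx * d.dvdx);
    float py = std::sqrt(d.dudy * d.dudy + d.dvdy * d.dvdy);
    float pMax = std::max(px, py);
    float pMin = std::max(std::min(px, py), 1e-6f);

    if (angleOptimized) {
        // Toy model: footprints whose major axis lies near the 0/45/90-degree
        // "sweet spot" angles keep the full sample budget; in-between angles
        // get it cut, which is why off-angle floors and slopes look blurrier
        // on such hardware.
        float ax = (px >= py) ? d.dudx : d.dudy;
        float ay = (px >= py) ? d.dvdx : d.dvdy;
        float angle = std::atan2(ay, ax);
        float off = std::fabs(std::remainder(angle, 3.14159265f / 4.0f));
        if (off > 3.14159265f / 16.0f)
            maxAniso = std::max(1, maxAniso / 2);
    }

    // Number of probes along the footprint's major axis, clamped to the
    // driver's max-AF setting.
    int n = (int)std::ceil(pMax / pMin);
    return std::min(n, std::max(1, maxAniso));
}

int main() {
    TexDerivs steepFloor{0.9f, 0.4f, 0.05f, 0.02f};  // strongly elongated footprint
    std::printf("degree: %d (angle-independent) vs %d (angle-optimized)\n",
                AnisotropyDegree(steepFloor, 16, false),
                AnisotropyDegree(steepFloor, 16, true));
    return 0;
}
```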
 
ivzk said:
Does this game belong to the GITG or TWIMTBP program?
It was developed on ATI hardware, and Elder Scrolls III was in the GITG program. You do the math (although Oblivion isn't mentioned on the GITG site).
 
It should also be noted that the 7900GTX in this test was factory-overclocked to 690/1760 and costs $580 on EVGA's website. How does a stock-clocked GTX fare?
 
I got a chance at lunch to play around some more with the two systems. Same deal, really: the ATI is faster. I can even out the FPS between them, but only with 4x or 8x HQ AF on the ATI and none on the Nvidia.
Being able to add HQ AF to the ATI card seems to be the major factor.

I DID turn off Vsync, and it did not make any major difference in the results. Still AF on the ATI and none on the Nvidia to keep the FPS the same.

I'm testing right when you come out of the Sewers and around there.

I'd love to hear some other people's results; maybe I'm doing something wrong and both setups can be tweaked to run better? :)
 