Oblivion Video Card Benchmarks

5150Joker said:
If this was an apples to apples review, the XTX would've stomped the GTX since it was using higher quality settings like HQ AF (nVidia doesn't even have an equivalent) and grass shadows. Bit-tech noted that the GTX gained 5 fps in its min fps testing just by shutting off grass shadows. That said, the XTX made an impressive show: it beat the GTX even while using higher quality settings. Looks like ATi was wise to go with a shader heavy architecture.
ok rollo....
 
Quick question: is bloom a type of HDR? I will be getting a 7900GT on Monday (it should have come this week, but FedEx was lazy and it sat in the same spot for a day and a half!!), and since I will be playing at a pitiful 1024x768 res, I will be cranking up a lot of settings and wondering if bloom plus AA would work. At this low a res, AA is a must, so I'm wondering if I can still get some bloom effects.
 
Bloom and AA work at the same time in Oblivion. Also, you can tweak the bloom settings in the .ini file to look extremely similar to HDR.
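
For reference, the bloom controls live in the [BlurShader] section of Oblivion.ini (the HDR counterparts sit under [BlurShaderHDR]). A rough sketch of the kind of tweak people mean, with key names and values from memory rather than gospel, so double-check them against your own .ini and back the file up first:

[BlurShader]
fSunlightDimmer=0.8
fSIHighlightMult=1.5
fBlurRadius=7.0000
iNumBlurpasses=2

The idea is that a wider blur radius and an extra blur pass give bloom a softer, more HDR-like glow, while dimming the sunlight contribution keeps bright areas from washing out.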

If you look at the minimum framerates for the high end cards you will see the big fault of this game. I suspect it's the engine that Oblivion is built on. I bet if you take the two high end cards from Nvidia and ATI and test them from 800x600 through 2560x1600, you will see that the frame rates will not be impacted nearly as much as in other games.
 
5150Joker said:
Looks like ATi was wise to go with a shader heavy architecture.

Heh, OT but it brings up a quote from ATI's recent conference call... ATI's CEO says he "honestly believe we shocked competition with R580".

I do believe that... refresh products have never seen such a large influx of muscle as with the R580.
 
5150Joker said:
If this was an apples to apples review, the XTX would've stomped the GTX since it was using higher quality settings like HQ AF (nVidia doesn't even have an equivalent) and grass shadows. Bit-tech noted that the GTX gained 5 fps in its min fps testing just by shutting off grass shadows. That said, the XTX made an impressive show: it beat the GTX even while using higher quality settings. Looks like ATi was wise to go with a shader heavy architecture.
They also said that the IQ difference is minimal...
 
If I get another 6800GT OC and SLI it, won't that allow me to run it at 1600x1200 instead of 1024x768?
 
bLaCktIGErs91 said:
Because you don't have a card that's not yet invented ;)

It's a hardware impossibility, on ANY card :(

Then why is the XBox 360 version capable of having HDR + AA on at the same time?
 
AppaYipYip said:
Then why is the XBox 360 version capable of having HDR + AA on at the same time?


The X360 does AA using logic built into the eDRAM daughter die on the GPU package (which is why the 360 has "free" AA). The X360 does HDR on the GPU core (obviously), so it is possible that this separation of processes is what enabled HDR+AA on the console.
 
AppaYipYip said:
Then why is the XBox 360 version capable of having HDR + AA on at the same time?
Because the XBox 360 has an AA-compatible FP10 mode.
 
aop said:
They also said that the IQ difference is minimal...

Yes, the difference is minimal. What does that mean? If you were to run the cards at the same settings, the XTX would have an even bigger advantage. Turning off HQ AF and matching the game settings would mean a rather large increase for the XTX. If there is no advantage to the higher IQ, why run at a higher IQ setting? I know I wouldn't. I would like to see some comparison shots; I imagine HQ AF would give a noticeable difference for the XTX.

I'm all for the "best playable" settings, but I also like to see apples to apples as well.
 
fallguy said:
Yes, the difference is minimal. What does that mean? If you were to run the cards at the same settings, the XTX would have an even bigger advantage. Turning off HQ AF and matching the game settings would mean a rather large increase for the XTX. If there is no advantage to the higher IQ, why run at a higher IQ setting? I know I wouldn't. I would like to see some comparison shots; I imagine HQ AF would give a noticeable difference for the XTX.

I'm all for the "best playable" settings, but I also like to see apples to apples as well.

HQ AF doesn't make a difference in that game, for whatever reason. Perhaps it's a bug (either in the game or the driver) and it's not working at all? There are certainly no visual clues as to whether it's working or not (I can certainly provide screenshots if you want).

While I agree I would like to see "apples to apples" settings, assuming the XTX would see "large gains" by turning off HQ AF is speculation at this point. It could be that it's not working and there would be no gain whatsoever.

The other consideration here is the game in question, Oblivion, as well as the test methodology. Since Oblivion uses "dynamic rendering", what control methods are in place to ensure that the scenes rendered are identical on both cards? Oblivion also uses different shader paths for each of the cards (2.0a on nV, 2.0b on ATi). As such, can there even be an "apples to apples" comparison?

Perhaps Brent's review of Oblivion might shed some light on this...
 
CaiNaM said:
HQ AF doesn't make a difference in that game, for whatever reason. Perhaps it's a bug (either in the game or the driver) and it's not working at all? There are certainly no visual clues as to whether it's working or not (I can certainly provide screenshots if you want).

While I agree I would like to see "apples to apples" settings, assuming the XTX would see "large gains" by turning off HQ AF is speculation at this point. It could be that it's not working and there would be no gain whatsoever.

The other consideration here is the game in question, Oblivion, as well as the test methodology. Since Oblivion uses "dynamic rendering", what control methods are in place to ensure that the scenes rendered are identical on both cards? Oblivion also uses different shader paths for each of the cards (2.0a on nV, 2.0b on ATi). As such, can there even be an "apples to apples" comparison?

Perhaps Brent's review of Oblivion might shed some light on this...

I wasn't aware it wasn't working. I don't know why it wouldn't be. I have two X1900s to try if I wanted, but I don't play RPGs... although with all the hype about this game, I may try it.

The "large gains" part of my post, was relating to HQ AF and that the game settings were higher on the XTX. Making everything even, driver and game settings would net the XTX large gains. As the GTX got a huge boost from 12fps to 17fps min. They didnt comment on the average frame gain.

You can manually edit the .ini so both use the same shader path, even 3.0 if you want.

I too am waiting for Brent's review; they normally do more IQ shots. FS will more than likely do a whole article (or two) on the game too. Looking forward to that as well.
 
Bit-tech's recommended playable settings may seem low, but that's to maintain pretty smooth fps. The difference between their recommended settings for the 7800GT OC and my usual running configuration (1280x960, HDR, mostly maxed except for grass shadows etc.) was significant. I usually get around 25-35 fps outside.

On the reduced settings that shot up to almost always 40-50 fps and the stutters became way less frequent. Distant land makes more of a difference than the resolution, but with it off, you get that fog effect in the distance.
 
ManicOne said:
The X1900 bench is spot on. Turning all options to max would make running 1600x1200 a little bit framey.

Watching my X1900 drop below 20 fps at 1280 (rarely, thankfully) is kinda hard to believe....


Yeah, it kicks my XT's butt at 1680x1050. So I go 2xAA and bloom to get the kind of frames I want. I'm getting two 7900GT Extreme Editions on some Dell deal instead, so as not to blow too much more money, switch boards, or any of that...

Hope it works out well. Now that I've been using an ATi X1900 card for a bit, I'm probably going to have to shut off the opts. I saw the shimmering with my older 7800GTXs, but it didn't bother me that much. I certainly appreciated it more when it was gone, though.

Hey Brent or anyone, how are SLI'd GTs with HDR? Mainly, I'm hoping 1680x1050 will be smooth. My widescreen is actually a CRT, though, so if I could push 1920x1200 that would be the tits. I know you've been testing out SLI and Oblivion at 1920x1200, Brent. :D
 
Comments:

- On my X1600XT, turning off HQ AF gives a *huge* performance increase. No visual difference at all, though. Indeed, while you can clearly notice the difference between 0xAF and 4xAF, going from 4xAF to 8xAF just doesn't show much at all. Can't imagine why you'd go higher. 4xAF without HQ AF is all you really need; spend your FPS on the other settings!

- HDR looks terrific in this game. It's not overdone like in some other titles - it's very subtle, and it just makes the game world feel more "real" than without. Bloom... eh, I'd like to see the .ini tweaks that make THAT work "like HDR". By default, they don't even begin to compare. Just watch a sunset over White Gold Tower looking down on the Imperial City with HDR on and then with bloom on (even with FSAA). One is breathtaking, the other is "meh, just another PC game".

I mean, I'm not saying it CAN'T be done - I've certainly read nothing about tweaking the .ini file to make the bloom effect look better, but "stock", the two are just MILES apart. I certainly would *like* it to be done, as even 2xAA in this game looks VERY nice. It's just that HDR looks WAY better than bloom, so....
 
pxc said:
The problem is the framebuffer format Bethesda used. No current PC video card, including ATI X1xxx cards, supports AA in that mode.

So which framebuffer format did they use? AFAIK Nvidia doesn't do "custom" formats with floating point blending, only FP16, and ATi does FP16 HDR with FSAA.

I think it's more a matter of Oblivion currently being a "hot" game, and an Nvidia "The way it's meant to be played" title (the PC version!). Nvidia simply doesn't allow an HDR+AA path for ATi X1K cards because it would make their own cards look obsolete.
 
texuspete00 said:
Hey Brent or anyone, how are SLI'd GTs with HDR? Mainly, I'm hoping 1680x1050 will be smooth. My widescreen is actually a CRT, though, so if I could push 1920x1200 that would be the tits. I know you've been testing out SLI and Oblivion at 1920x1200, Brent. :D

Dunno about GTs, but with my GTXs I prefer bloom and 4xAA over HDR. I run it at 16x12.
 
pxc said:
I didn't say it did. :p I know HDR in Oblivion requires SM3 and the R4xx processors lack it.

There is no SM3 support in Oblivion, and the only way any HDR format would require a specific shader model is if the developer arbitrarily tied the feature to a shader profile in their game. HDR and DirectX shader models have nothing to do with each other.
 
John Reynolds said:
There is no SM3 support in Oblivion
Change the "bAllow30Shaders=0" line in oblivion.ini to 1 to enable SM3. I've only seen miniscule speed improvements from enabling that option.

But it's correct that SM3 is not required for HDR. Sorry about that.
 
pxc said:
Change the "bAllow30Shaders=0" line in oblivion.ini to 1 to enable SM3. I've only seen miniscule speed improvements from enabling that option.

But it's correct that SM3 is not required for HDR. Sorry about that.

I don't think that changes the VS/PS compile targets, which are still 2.0a for nVidia and 2.0b for ATi cards.
 