HDR+AA in Oblivion for Ati X1K!

Congrats, ATI X1000 owners! I'm waiting for some patch action from Bethesda. And NVIDIA, where are you? We need drivers! :D
 
Pretty playable at 1920x1200 with HDR + 4xAA and 16x HQ AF. Foliage drops the framerate pretty far; all other areas are pretty smooth. I suspect 2xAA in foliage would be smooth enough, but at 4x it's not enough for me. I've also got all the image quality enhancements increased through the game's .ini, which puts a pretty large hit on frames too.
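For the curious, here's the sort of thing I mean by .ini tweaks. This is only a sketch: the file is Oblivion.ini (under My Documents\My Games\Oblivion), the setting names are from the stock file as best I remember them, and the values are illustrative rather than my exact config. Back the file up and check the defaults in your own copy before changing anything.

```ini
; Illustrative Oblivion.ini view-distance/grass tweaks (values are examples, not my exact config)
[General]
uGridDistantCount=50         ; distant land detail; stock default is 25
uGridDistantTreeRange=30     ; distant tree draw range; stock default is 15

[Grass]
fGrassStartFadeDistance=7000.0000  ; push grass fade-out further; stock default is lower
```

Raising these extends draw distances, which is exactly why the framerate hit gets larger.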

[email protected], 2x1900XT's Crossfire, A8R32-MVP.

J-Mag said:
I would like to see an X-Fire X1900 vs SLI 7900gtx comparison now that oblivion is working with X-fire

http://firingsquad.com/hardware/oblivion_high-end_performance/page9.asp

In Foliage 4xAA/8xAF, CF's minimum is higher than SLI's maximum....
 
fallguy said:
In Foliage 4xAA/8xAF, CF's minimum is higher than SLI's maximum....

What exactly is "foliage" anyway? Are they talking about trees, bushes, and harvestable plants, or are they talking about huge fields of randomly generated alpha-blended grass? There's a big difference, and I'd like to know.
 
fallguy said:
In Foliage 4xAA/8xAF, CF's minimum is higher than SLI's maximum....
:eek:
But then again the game is still pretty young, Bethesda still needs to release some patches which fix the strange fps behaviour outdoors on PC.
 
Drexion said:
:eek:
But then again the game is still pretty young, Bethesda still needs to release some patches which fix the strange fps behaviour outdoors on PC.

Nvidia still needs to release some good drivers, too. I didn't really notice any difference between 84.21 and 84.25 on my 7900GTX; I think the 84.25s were mostly for SLI users.
 
Mrwang said:
I'm jealous. Damn you ATI. Anyone want to buy a used 7900GT?

It's been known since October '05 (the X1K launch) that ATI has better image quality options (HQ AF, plus FP HDR with AA) than NVIDIA... ;)
 
asmielia said:
Ok, hate to say this as a 7900gt owner, but ATI now officially stomps Nvidia in the latest and greatest game.

If they get better framerate with foliage while maintaining AA on at the same time, that's going to add insult to injury.

Adrian

Just one game ;) I've never figured out the obsession with Morrowind/Oblivion/etc... I couldn't get into those RPGs.
 
lopoetve said:
Just one game ;) I've never figured out the obsession with Morrowind/Oblivion/etc... I couldn't get into those RPGs.


Oblivion players aren't limited to hardcore RPGers... it's like a medieval Deus Ex. I am not an RPG player, I hate all things 'orcs and dorks', and I thought Morrowind was some kind of MMORPG until after I got Oblivion.
 
Devnull said:
Oblivion players aren't limited to hardcore RPGers... it's like a medieval Deus Ex. I am not an RPG player, I hate all things 'orcs and dorks', and I thought Morrowind was some kind of MMORPG until after I got Oblivion.

I would happen to agree. I didn't try Morrowind, WoW, EQ, or ANY other RPG, MMORPG, or any of the other alphabet-soup-named games. The main reason is they just weren't fast-paced enough for me, and I refuse to 'pay to play'. I pay to buy it, and that's all. About as far into RPGs as I like to go is Zelda... But with all the hype, I tried Oblivion. It's still far from my favorite game, but it's pretty darn fun. I really like the wide open space, like Far Cry has, and Gun does as well. I like to roam. :) It just happens to be the hottest game out right now, it has great reviews, and it plays/looks much better on ATI hardware. It's ATI's "Doom3", so to speak, but even more so. That said, it is only one game, and if you don't like it, then it doesn't matter to you.

It's my opinion that NV's new card will be able to do HDR+AA in the same games ATI can, and have better AF, but as of right now, they don't.
 
OK, am I missing something? Before this patch even came out I forced everything on: HQ AF, AA, all of it. Now, I will admit that I had one or two crashes, and when exiting, the game would go to 'not responding' and have to be force-closed.

I was running 1024x768, HDR, 4xAA, HQ AF, and everything else forced on in the 3D section of the CCC panel. I was getting about 15-30 fps with draw distance on max and grass shadows off.

Computer Specs:

ASUS A8R32-MVP Deluxe bios 311
AMD 64 3000+ 2.0 GHZ Winnie
1GB Mushkin Blue DDR400 1T 2-2-2-4
EPower Tagan 480W-U22
Sapphire X1900XT 650/750 w/CCC
2x 100GB Maxtor Sata No Raid
HDA X Mystique DDL Sound
Logitech Z 5500's in FULL DOLBY :)
APC 1000W UPS
 
Kinda makes you wonder what the motivation was behind this statement, and why they didn't include HDR+AA for ATI cards in the game.

Oblivion Dev's said:
On PC, HDR and AA are not supported simultaneously at all, on any video chipset, because of the way we do HDR.

If you absolutely need antialiasing (and the HDR effect looks great without it, frankly), you can switch to the bloom effect.

They suggest you use an NV card to play Oblivion after installing, and it's an NV-stamped game. The game obviously runs better on ATI hardware, and now, thanks to this patch, looks better too. Things that make you go "hmmm"...
 
fallguy said:
Kinda makes you wonder what the motivation was behind this statement, and why they didn't include HDR+AA for ATI cards in the game.



They suggest you use an NV card to play Oblivion after installing, and it's an NV-stamped game. The game obviously runs better on ATI hardware, and now, thanks to this patch, looks better too. Things that make you go "hmmm"...


This patch just proves Bethesda wasn't fully truthful about HDR+AA. They claimed HDR+AA wasn't possible on the PC. Yeah, it's not possible on NVIDIA cards, and since Oblivion is a TWIMTBP title, it makes you wonder...
 
Devnull said:
Oblivion players aren't limited to hardcore RPGers... it's like a medieval Deus Ex. I am not an RPG player, I hate all things 'orcs and dorks', and I thought Morrowind was some kind of MMORPG until after I got Oblivion.

See, I AM a pretty hardcore RPG player, and I still don't like either of them. Yeah, sure, they're pretty...

so? :-p

Oh, they have cool soundtracks too, I guess.
 
fallguy said:
I would happen to agree. I didn't try Morrowind, WoW, EQ, or ANY other RPG, MMORPG, or any of the other alphabet-soup-named games. The main reason is they just weren't fast-paced enough for me, and I refuse to 'pay to play'. I pay to buy it, and that's all. About as far into RPGs as I like to go is Zelda... But with all the hype, I tried Oblivion. It's still far from my favorite game, but it's pretty darn fun. I really like the wide open space, like Far Cry has, and Gun does as well. I like to roam. :) It just happens to be the hottest game out right now, it has great reviews, and it plays/looks much better on ATI hardware. It's ATI's "Doom3", so to speak, but even more so. That said, it is only one game, and if you don't like it, then it doesn't matter to you.

It's my opinion that NV's new card will be able to do HDR+AA in the same games ATI can, and have better AF, but as of right now, they don't.

See, I've played everything from Deus Ex and System Shock 2 to Baldur's Gate and all of the Final Fantasies and more...

And I still don't like Morrowind or any other Bethesda RPG... just not my cup of tea.
 
I was on the fence about which GPU to buy for this game, between the X1900XT and the 7900. But this settles it, I'm going for the X1900. Now I just need to order it and a VF900-CU and I'm set :D.
 
fallguy said:
Its ATi's "Doom3" so to speak

I know you were just making a general analogy here, but I feel compelled to point out that the performance difference between ATI and nVidia in Oblivion is not even close to being as staggering as it was in Doom 3 when that game was released.

When Doom 3 was new, even the vanilla 6800 was beating ATI's top-of-the-line parts. If I recall correctly, it was due to some virtual 32-pixels-per-clock trick that nVidia was able to do with shadows (and Carmack loving nVidia probably didn't hurt either).

From what I've seen of Oblivion benches, the X1900's shader power doesn't seem to have anything to do with the better performance, as even the X1800XT is matching the 7900GTX in some of those tests. I would be interested to know what it is about the "foliage" that generates such favorable results for ATI, because in the mountain and indoor tests at FiringSquad, the 7900GTX and X1900 both offer comparable, comfortable play with a slight edge to the Radeon, whereas in the foliage tests the X1900 leads by a pretty wide margin and gives comfortable framerates where the GTX is borderline unplayable.
 
On PC, HDR and AA are not supported simultaneously at all, on any video chipset, because of the way we do HDR.

Joke of the year, obviously influenced by TWIMTBP $$$.
 
fallguy said:
Kinda makes you wonder what the motivation was behind this statement, and why they didn't include HDR+AA for ATI cards in the game.



They suggest you use an NV card to play Oblivion after installing, and it's an NV-stamped game. The game obviously runs better on ATI hardware, and now, thanks to this patch, looks better too. Things that make you go "hmmm"...
Yeah, they also said it would look best on the 360 :rolleyes:
Basically, the statements they made came after heavy consideration of the money they received.
 
2006-04-06 16:15:35 - Oblivion
Frames: 4067 - Time: 127488ms - Avg: 31.901 - Min: 11 - Max: 72



2006-04-06 23:31:17 - Oblivion
Frames: 1350 - Time: 64448ms - Avg: 20.947 - Min: 10 - Max: 62


The first set is HDR, AAx4, HQAFx16 1280x1024.

Second set is HDR, 4x Adaptive AA (AAA), HQAFx16, same rez.

Location: Outside Imperial City, near the first rider you meet, walking around the water and the forest area. All settings at max. No grass shadows, no self shadows.

EDIT: Forgot to mention VSYNC WAS ON. lol
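As a sanity check, the Avg figures in those FRAPS summaries should just be Frames divided by Time. A quick hypothetical script (not part of FRAPS or the game) confirming the arithmetic:

```python
# Verify FRAPS summary lines: Avg fps = Frames / Time(seconds).
def avg_fps(frames: int, time_ms: int) -> float:
    """Average FPS from a FRAPS benchmark summary line."""
    return frames / (time_ms / 1000.0)

runs = [
    # (label, Frames, Time in ms, reported Avg)
    ("HDR + 4xAA ", 4067, 127488, 31.901),
    ("HDR + 4xAAA", 1350,  64448, 20.947),
]

for label, frames, time_ms, reported in runs:
    computed = avg_fps(frames, time_ms)
    print(f"{label}: computed {computed:.3f} fps, reported {reported:.3f} fps")
```

Both runs work out to the reported averages, so the logs are internally consistent.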
 
"The way it's meant to be played" is clearly without simultaneous AA and HDR, per the Bethesda statement. Shame on you all for not playing it as the developers intended! And it's a shame that the way it's meant to be played is at sub-par settings. What a strange development philosophy.

And of course I'll be patching my twin X1900s once I get back to my PC, which is currently about 250 miles away. I've just got my crappy work-issued Intel/nVidia laptop :-(
 
To those who have installed this already, PRECISELY how do you install this hotfix?

Do you go ahead and install it over the existing 6.3 drivers?
Do you say yes or no to overwriting newer files?

Are there any other tips or tricks?
 
Anyone got a mirror for the drivers? When I load up the ATI page it's blank apart from -->

Can't download 'em.
:(
 
arentol said:
To those who have installed this already, PRECISELY how do you install this hotfix?

Do you go ahead and install it over the existing 6.3 drivers?
Do you say yes or no to overwriting newer files?

Are there any other tips or tricks?

I just installed over the Cat 6.3s and said yes to overwriting files.

Windows warns you about them not being approved.

Restart Windows.

Set AA in CCC and HDR in Oblivion's options.

Make sure you don't have Cat AI disabled.

Good luck.


Edit: added the restart step.
 
Have you guys done much testing to see what the best drivers are right now for ATI and Oblivion? Is most everyone using the Cat 6.3s? What about the newest Omega drivers? I'm trying to decide which to install for the best performance/IQ.
 
burningrave101 said:
Have you guys done much testing to see what the best drivers are right now for ATI and Oblivion? Is most everyone using the Cat 6.3s? What about the newest Omega drivers? I'm trying to decide which to install for the best performance/IQ.
Well, hands down, the best drivers right now are the 6.3 Cats with the patch to enable HDR+AA. The visual quality is stunning. :)
 
When Doom 3 was new, even the vanilla 6800 standard was beating ATI's top of the line parts. If I recall correctly, it was due to some virtual 32 pixels per clock trick that nVidia was able to do with shadows (and Carmack loving nVidia probably didn't hurt either)

The game had support for UltraShadow, which was a feature of the 6 series, and NV hardware has generally had superior OpenGL support. A couple of features plus better support allowed the lower-end hardware to really kick ass in that game.

From what I've seen of benches in Oblivion, the X1900's shader power doesn't seem to have anything to do with the better performance, as even the X1800XT is matching the 7900GTX in some of those tests. I would be interested to know what it is about the "foliage" that generates such favorable results for ATI, b/c in the mountain and indoor tests at firing squad, the 7900GTX and X1900 are both offering comparable and comfortable play with a slight edge to the Radeon, whereas in the foliage tests, the X1900 is leading by a pretty wide margin and giving comfortable framerates where the GTX is borderline unplayable.

The shader power is definitely making a huge difference, as a single X1900 comes damn close to SLI'd GTX setups. What most likely helps with foliage is the more efficient hardware it runs on; the new cache system and memory bus are most likely the reason this card kicks foliage's ass in Oblivion.

I don't believe the patch addresses anything outside of HDR+AA and AFR mode for CrossFire in Oblivion.

That's all I believe it was: someone had a bunch of code they thought would make AA work with HDR, and also added a profile for CF.
 
If I recall correctly, it was due to some virtual 32 pixels per clock trick that nVidia was able to do with shadows (and Carmack loving nVidia probably didn't hurt either)

I think you guys say this stuff just to piss me off. Well, let me just say that it works.
 
phide said:
I think you guys say this stuff just to piss me off. Well, let me just say that it works.

Sorry to piss you off with my general inaccuracy. Say whatever you want; it won't hurt my feelings. I just didn't feel like looking up the specific names and specs for the feature I was talking about, since it doesn't really matter that much anymore (at least not to me; I haven't touched Doom 3 since I stopped playing it about 2/3 of the way through).
 
It was really just in reference to the John Carmack comment. My apologies; I should have clarified this.

I understand that this is off topic, but I've posed the following question to others before with little or no response: Why do you believe John Carmack has some sort of pro-nVidia bias?
 
I don't believe he has any bias towards them. I think he likes them simply because of their superior OpenGL performance, and that's what he works with. No personal bias; he just goes with what's best for what he works with.
 
phide said:
Was really just in reference to the John Carmack comment. My apology - should have clarified this.

I understand that this is off topic, but I've posed the following question to others before with little or no response: Why do you believe John Carmack has some sort of pro-nVidia bias?

I thought it was just commonly accepted knowledge that he preferred to optimize for nVidia. Whether it's true or not, bias towards one team or the other seems to be the standard these days. You've got to pick your card based on the games you most want to play. Yes, both sides play any game relatively well, but you'd better believe I'm still kicking myself for getting a 7900GTX when an X1900 has a pretty clear advantage in my game of the moment, especially with this HDR+AA patch out :(

Trimlock said:
I don't believe he has any bias towards them. I think he likes them simply because of their superior OpenGL performance, and that's what he works with. No personal bias; he just goes with what's best for what he works with.

^^what he said.
 
phide said:
It's a shame that Oblivion likes to 'cut' my X1900 off, resulting in nearly constant VPU recovers. It's also a shame that Bethesda tech support and ATi tech support simply don't care.

desrin said:
Because it is your problem, not theirs. If a zillion other people can run with X1900s and you can't, you shouldn't blame them. How it could be Bethesda's or ATI's fault when they know X1900 cards are compatible with Oblivion makes no sense.

I wouldn't care either.

I'm having this exact same problem. I've been unable to locate any fix for it yet, and I'm hesitant to play because I don't want this game damaging my video card or (more likely) my monitor, since it's losing signal so often and restarting.

Desrin appears to like making blind comments. Not many people who post on these boards own both this card and this game, so stop making generalized statements.
 
Hartz said:
Why are there still no screenshots showing it off?

[Screenshots attached: obvc0xaa.jpg / obvc4xaa.jpg and obva0xaa.jpg / obva4xaa.jpg (0xAA vs. 4xAA comparisons)]
 