HDR + AA

5150Joker said:
I have no render issues, no crash problems or any other issues with the "Chuck" drivers. Where'd you get this from?

?????

You asked me to check out the patch the other day, so I followed the link in that Chuck thread on AT you linked me to?

http://support.ati.com/ics/support/default.asp?deptID=894&task=knowledge&questionID=21960

This driver is provided as a proof-of-concept and is not supported by Bethesda, 2K Games, or ATI Technologies.

Known issues:

1. SuperAA with CrossFire in HDR mode is not enabled. If you have SuperAA enabled, you will get the quality and performance of one card as opposed to two.

2. Rendering issues with grass shadows

3. On an ATI Radeon X1600XT CrossFire configuration, the game may intermittently crash if the resolution is set higher than 1600x1200.

4. On an ATI Radeon X1900 series card, the game may crash while task-switching between the desktop and game (alt-tab).
 
R1ckCa1n said:
This could be because one in ten thousand users experienced the problem.........

Both companies have issues with drivers; some just tend to release them more often. That is completely off topic from the OP.
I sincerely doubt that a "1 in 10,000" issue would make it into the "known issues" section of a driver's release notes, beta or not.
 
SnipingWaste said:
Where are the release notes for the 84.25 beta? I can't find them anywhere. Are there release notes for the 84.25 beta, and if so, what do they say about Oblivion?

On this nZone page:
http://www.nzone.com/object/nzone_downloads_winxp_2k_32bit_84.25.html

There is a link to the release notes (PDF format):

Issues Resolved in Version 84.25
The following are changes made and issues resolved since driver version 84.21:
• General compatibility fixes

Doesn't say much...
I quickly scrolled through, but couldn't find anything about Oblivion at first glance...

Terra...
 
I sincerely doubt that a "1 in 10,000" issue would make it into the "known issues" section of a driver's release notes, beta or not.

You don't know ATi very well. Back when I had a Rage Fury playing EQ, the latest driver release notes said "EQ: Might not render trees," and trust me, I never had any tree rendering problems.
 
Trimlock said:
You don't know ATi very well. Back when I had a Rage Fury playing EQ, the latest driver release notes said "EQ: Might not render trees," and trust me, I never had any tree rendering problems.

How do you know? :p
If it didn't render the tree, you wouldn't see the "missing" tree ;)

Terra...
 
CaiNaM said:
I don't believe any wondering is required; partnering in nVidia's TWIMTBP program is designed to help nVidia sell more cards (just as the whole "Get in the Game" partnership between ATi and Valve was meant to sell more ATi cards). It's also possible ATi knows their hardware better than Bethesda :)

Do you think Bethesda intentionally left out HDR+AA for ATi users, and knowingly lied that it simply couldn't work? With the evidence that is out there, that sure is how it looks. And if true, that is very pathetic.
 
fallguy said:
Do you think Bethesda intentionally left out HDR+AA for ATi users, and knowingly lied that it simply couldn't work? With the evidence that is out there, that sure is how it looks. And if true, that is very pathetic.
I'd hate to be one who believes in "conspiracy theories", but it's certainly not beyond the realm of possibility. More likely, though, they were simply ignorant. They probably believed it was not compatible with how they designed their rendering, but an ATi employee who knew the hardware better figured out a workaround.

Could certainly be either/or, though.
 
So you think they had no problem enabling HDR+AA on the Xbox, but it was too complicated for them to enable it on the X1K? C'mon :)
 
fallguy said:
Do you think Bethesda intentionally left out HDR+AA for ATi users, and knowingly lied that it simply couldn't work? With the evidence that is out there, that sure is how it looks. And if true, that is very pathetic.

They probably just didn't know how to make AA and HDR work together like that. I don't see a big reason for them to openly lie about it. Even though Oblivion may be an nVidia game on the PC, it was also released for the 360, and the 360 runs ATI hardware.
 
I think Bethesda knew that HDR+AA could be enabled the whole time. It's just their deal with Nvidia that made them lie about it.
 
There is no "deal". Many of you are living in a fantasy world.

There are people who still believe that the U.S. never landed on the Moon, even when the evidence is abundantly clear that we did. Why? There exist strange mental complexes that people like myself just don't understand. To those Moon landing theorists, facts are fabricated in their minds. To the rest of the billions on planet Earth, facts have to come from somewhere a bit more respectable.

Bethesda made a decision based on variables that we know nothing about. We can speculate, but we should never word our speculations as fact or otherwise intend them to be viewed as such.
 
phide said:
There is no "deal". Many of you are living in a fantasy world.

There are people who still believe that the U.S. never landed on the Moon, even when the evidence is abundantly clear that we did. Why? There exist strange mental complexes that people like myself just don't understand. To those Moon landing theorists, facts are fabricated in their minds. To the rest of the billions on planet Earth, facts have to come from somewhere a bit more respectable.

Bethesda made a decision based on variables that we know nothing about. We can speculate, but we should never word our speculations as fact or otherwise intend them to be viewed as such.

QFT...

Terra - BTW, Badastronomy.com rules ;)
 
Trimlock said:
Haha, I'd be doing a forward moon walk

Only if you hit the invisible tree ;)
Trees don't usually grow on roads/paths or in caves either ;)

Terra - All GFX cards have issues...
 
It's a good thing to have - I mean, why not? Have your cake and eat it too. :) But I find it really lame how ATi fans boast about this to get people to go ATi. It's barely usable, and when it is, it's crappy imho. It's a technology that, much like anti-aliasing, will be perfected eventually; all card makers will use it and it'll run great. For now, it's like back in 2000 when GeForce2 GTS owners would boast about using AA. Not completely necessary yet, but a cool feature nonetheless. ;)

Oblivion is one game... and a game I certainly wouldn't touch, so meh.
 
It is definitely not a useless feature; it is very playable on my X1900XT with HDR and AA. Saying that it's a useless feature, especially when you haven't tried it and are just trying to downplay it, is just arrogant IMO.
 
Rollo said:
?????

You asked me to check out the patch the other day, so I followed the link in that Chuck thread on AT you linked me to?

http://support.ati.com/ics/support/default.asp?deptID=894&task=knowledge&questionID=21960


Well, the game isn't stable with any card when alt-tabbing, and yes, it's true there are grass-shadow flickering issues and, according to some, CrossFire problems as well; those should be sorted out in Catalyst 6.4. That doesn't mean the drivers are anywhere near as unstable as you made them sound in your other post, though. FYI, they are rock solid, especially for beta drivers. Just look at the 84.25 driver set for nVidia cards: users are complaining of reduced IQ and lower OpenGL performance.
 
Particleman said:
It is definitely not a useless feature; it is very playable on my X1900XT with HDR and AA. Saying that it's a useless feature, especially when you haven't tried it and are just trying to downplay it, is just arrogant IMO.


Bad choice of words. I guess I meant "not completely necessary but nice".

:p
 
Interview with "Chuck".

http://www.hexus.net/content/item.php?item=5282

Pretty pathetic what Bethesda did, but I guess that's to be expected sometimes.

The main part:

Chuck @ ATI Here are the gory details. When you force on MSAA through the Catalyst Control Center, our driver has a special "ForceAA" path that will allocate an MSAA buffer and bind it to the flip chain (back buffer). That causes all rendering to the flip chain, to actually go into the MSAA buffer, so you get higher quality. This doesn't work for Oblivion, because most of the rendering doesn't go into the flip chain, so you don't see any difference. In my patch, I made a special ForceAA path to enable MSAA on textures instead of the flip chain.

Currently, this path is only enabled for Oblivion under Catalyst AI app detection. The exciting thing is that this ForceAA path could be enabled for other games in the future, like FarCry and Splinter Cell, so these games could get HDR+AA also! In my special ForceAA path, I detect when the correct FP16 renderable texture is created and then allocate a separate FP16 MSAA buffer and bind it to the texture. From that point on, all rendering the app does to the FP16 texture actually goes into the FP16 MSAA buffer instead.

When the app is done rendering into the FP16 buffer and attempts to bind it as a source texture, I tell the hardware to combine the multiple samples from the MSAA buffer and put the anti-aliased image into the FP16 texture that the app created. The app then reads from the texture as usual. All of the data is full-quality FP16 from start to finish. We don't do any shortcuts or swap formats behind the app's back.

(Note: When you hit TAB to equip weapons, etc., that scene is not rendered into the FP16 surface, so it will not get AA even with the patch. You can see how aliased your character looks and compare that to how good the game looks.)

Chuck @ ATI On Friday, I spent a few hours studying the game's rendering path. Over the weekend I thought about how I could design an HDR+AA path for Oblivion without breaking other programs. On Monday, I wrote all the code and started testing it. I'd say about 12 hours total for designing and coding.

First Bethesda said it couldn't be done. Then they said it would take far too long, and asked if we wanted the game this year. Now it's come out that it took 12 hours of programming? *sigh*
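
To make Chuck's description a little more concrete, here is a rough sketch of the interception logic he outlines. Every type and function name below is invented for illustration; this is not ATI driver source, just the general shape a "ForceAA" path would take:

// Hypothetical sketch of the ForceAA path Chuck describes.
// Invented names, NOT actual ATI driver code. C++14 or later.
struct Surface {
    bool     fp16Renderable = false;
    Surface* msaaShadow     = nullptr;  // driver-private MSAA buffer
};

struct Device {
    bool     forceAaEnabled = false;    // set by Catalyst AI app detection
    Surface* boundTarget    = nullptr;

    // When the app creates the FP16 renderable texture, quietly allocate
    // a matching multisampled buffer that the app never sees.
    Surface* createTexture(bool fp16Renderable) {
        Surface* tex = new Surface;
        tex->fp16Renderable = fp16Renderable;
        if (forceAaEnabled && fp16Renderable)
            tex->msaaShadow = new Surface;
        return tex;
    }

    // All rendering the app aims at the FP16 texture actually lands
    // in the shadow MSAA buffer instead.
    void setRenderTarget(Surface* tex) {
        boundTarget = (tex->msaaShadow != nullptr) ? tex->msaaShadow : tex;
    }

    // When the app binds the texture for sampling, combine the samples
    // back into the real texture first (the "resolve"), so the app reads
    // full-quality anti-aliased FP16 data, no format swaps behind its back.
    void setTexture(int stage, Surface* tex) {
        if (tex->msaaShadow != nullptr)
            resolveMsaa(tex->msaaShadow, tex);
        bindToSampler(stage, tex);
    }

    void resolveMsaa(Surface* /*src*/, Surface* /*dst*/) { /* hardware resolve */ }
    void bindToSampler(int /*stage*/, Surface* /*tex*/)  { /* bind for sampling */ }
};

The key point in Chuck's description is that the game's own code path never changes: the driver swaps the render target underneath the app and resolves when the texture is bound for sampling, which is why the path can be switched on per game through Catalyst AI detection.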
 
So which of these scenarios could it be:

1. MS paid Bethesda a good chunk of change not to add HDR+AA to the PC version.
2. nVidia had a hand in Bethesda not adding HDR+AA.
3. Bethesda are incompetent and lazy, so they didn't want to take the time to add HDR+AA.
 
Nice to read that his patch can force HDR+AA in all (future) FP HDR titles that don't have AA support from the game dev. :cool:
 
MFZ said:
So which of these scenarios could it be:

1. MS paid Bethesda a good chunk of change not to add HDR+AA to the PC version.
2. nVidia had a hand in Bethesda not adding HDR+AA.
3. Bethesda are incompetent and lazy, so they didn't want to take the time to add HDR+AA.

Sadly, my vote is number two. You can't put "The Way It's Meant To Be Played" on the side of a box and then have it look and run better on your competitor's hardware, but that's exactly what happened.
 
You still fail to grasp the concept of the TWIMTBP program after I've already explained it.

It does not indicate what you might think it does based on the name. It's PURELY a way for nVidia and Bethesda to collaborate on increasing the Good Things and decreasing the Bad Things. If you want to be one of those anti-Moon-landing guys, go ahead - just don't expect anyone to put much weight on the opinions you choose to word as fact.

I know, it's a horrible name. That's not the point. A Lamborghini is a bull (I guess), but their cars aren't actually bulls, as is plainly obvious by looking at them.
 
phide said:
You still fail to grasp the concept of the TWIMTBP program after I've already explained it.

It does not indicate what you might think it does based on the name. It's PURELY a way for nVidia and Bethesda to collaborate on increasing the Good Things and decreasing the Bad Things. If you want to be one of those anti-Moon-landing guys, go ahead - just don't expect anyone to put much weight on the opinions you choose to word as fact.

I know, it's a horrible name. That's not the point. A Lamborghini is a bull (I guess), but their cars aren't actually bulls, as is plainly obvious by looking at them.
I thought TWIMTBP meant the developer got cold hard cash and then would "make" the game run better on their hardware.
 
phide said:
You still fail to grasp the concept of the TWIMTBP program after I've already explained it.

It does not indicate what you might think it does based on the name. It's PURELY a way for nVidia and Bethesda to collaborate on increasing the Good Things and decreasing the Bad Things. If you want to be one of those anti-Moon-landing guys, go ahead - just don't expect anyone to put much weight on the opinions you choose to word as fact.

I know, it's a horrible name. That's not the point. A Lamborghini is a bull (I guess), but their cars aren't actually bulls, as is plainly obvious by looking at them.

What the hell are you talking about? "Increasing the good things and decreasing the bad"? What things are these? Nvidia pays Bethesda money and in turn gets to put their logo on the box. It's as simple as that.
 
Mrwang said:
Sadly, my vote is number two. You can't put "The Way It's Meant To Be Played" on the side of a box and then have it look and run better on your competitor's hardware, but that's exactly what happened.

I don't buy that. Personally, I swing with what Rys said about it being an Xbox 360 thing; just my personal opinion.
 
MFZ said:
So which of these scenarios could it be:

1. MS paid Bethesda a good chunk of change not to add HDR+AA to the PC version.
2. nVidia had a hand in Bethesda not adding HDR+AA.
3. Bethesda are incompetent and lazy, so they didn't want to take the time to add HDR+AA.

The question is: will the Chuck patch sell fewer Xbox copies? Don't know. Will the Chuck patch sell fewer Nvidia cards (and more ATi)? I think yes.
 
MFZ said:
So which of these scenarios could it be:

1. MS paid Bethesda a good chunk of change not to add HDR+AA to the PC version.
2. nVidia had a hand in Bethesda not adding HDR+AA.
3. Bethesda are incompetent and lazy, so they didn't want to take the time to add HDR+AA.


Sadly, none of those even fit the picture. Bethesda is partners with ATi, and if you remember, Morrowind was one of ATi's games. Bethesda wanted to keep the code consistent across all platforms, so changing it on one platform would have effects on another. This would have taken a bit of time if they were to introduce the patch into their program, and they would then have had to modify their paths in the Xbox version too. A good example is Far Cry; their HDR+AA patch (which isn't out yet but should have been) took a good month or two to get in.
 
Apple740 said:
Oblivion is on Nv's TWIMTBP list.


Bethesda as a company works with both nV's and ATi's programs, as do Crytek and many other developers. Take the Cry Engine: the version being used on the Xbox 360, Cry Engine 1, is capable of HDR+AA on the Xbox, but it isn't able to do this in the PC version without a patch. Same situation here.
 
Yeah, but as a TWIMTBP partner on Oblivion, Nvidia had a big finger in the soup, I think. The Xbox version stands apart from this; they have no influence there.

Well, we'll never know what really happened, I think.
 
Coolmanluke said:
Nvidia pays Bethesda money and in turn gets to put their logo on the box. It's as simple as that.

Not true at all. There's widespread information available on the TWIMTBP program, and it's clear you haven't been reading any of it.

Good Things are performance-enhancing optimizations. Bad Things are issues with stability or rendering that are directly related to the renderer, nVidia drivers, or nVidia hardware. This all falls under the TWIMTBP program.

I don't think the Chuck patch will have any impact on ATi sales, but I could be wrong.
 
Apple740 said:
Yeah, but as a TWIMTBP partner on Oblivion, Nvidia had a big finger in the soup, I think. The Xbox version stands apart from this; they have no influence there.

Well, we'll never know what really happened, I think.

Both engines, if I'm not mistaken, are compiled at the same time; well, they can be compiled at the same time. The Cry Engine was built this way, and I'm pretty sure Gamebryo and many other engines are built this way too; it just saves time in the long run. Your shader lists are separate from your API DLLs, so let's say you want to modify the shader lists: you have to make appropriate changes in your game.dll, your game.exe, your api.dll, and so forth. I don't know Bethesda's code, but it all depends on how they programmed their game and engine DLLs; it could have been quite difficult for them to add a feature like this, because it goes to the base of the engine core. Another example: I can't tell you the specifics of the shader or the shader itself, but we wanted to add a specific effect to the Cry Engine, and we had to do some considerable rework to get MRTs into it. A very simple shader, too; many other engines have this shader, but the Cry Engine did it in a different way, which our level designers didn't like. As simple as it is, it took us 2 months if not more; could have been more, I don't remember exactly.

I don't think they did. There's the question of it being usable on one ATi product and not the other, and the similarity with other engines having to go through the same thing (with the Cry Engine, when they first introduced HDR into their engine, there was no MSAA-for-HDR hardware available; on top of that, a new path had to be made specifically for ATi cards, not just some product ID lists that had to be changed). Too many coincidences to just be pointing at game programs on this one, especially since at the time of Far Cry's HDR there was no such thing as MSAA for HDR.
 
phide said:
I don't think the Chuck patch will have any impact on ATi sales, but I could be wrong.

It would appear so. I've seen several posts like this, and even about people taking their Xbox version back, because the reason they got it was the better IQ. Now that is no longer the case.

I'm not suggesting this is going to make people stop buying 7900GTs, far from it. But the X1800XT and 7900GT are pretty even in most benches. There are posts all the time about which one to get. Oblivion is a HUGE game right now. HDR+AA may be enough to tip some sales to the X1800XT alone, not to mention the XT is already a lot faster than the GT overall in Oblivion. Faster with better IQ? Yeah, I'd say it will impact at least a few sales.
 
fallguy said:
Interview with "Chuck".
....
First Bethesda said it couldn't be done. Then they said it would take far too long, and asked if we wanted the game this year. Now it's come out that it took 12 hours of programming? *sigh*

To be fair, you'll notice from the "Chuck" interview that he is basically COMPLETELY changing how the driver handles the rendering in the game, presenting an option that was *not* accessible before.

For all we know, doing it "the old way" really would have taken months for Bethesda to get the code working. It's just that someone on ATI's driver team decided to simply bump around render targets on the game and create a way to blend them together.

It's not like Bethesda can get FSAA+HDR working in their game by changing ATI's drivers for them; they have to change game code! I.e., it's possible Bethesda's method of FSAA+HDR requires months of game-code changes, while ATI's method requires 12 hours of driver-code changes.

See what I'm saying? Not necessarily a conspiracy theory. The only real issue is: when Bethesda noticed this problem, why didn't they just ask ATI what their options were to get it working ASAP? I think we can attribute that lack of diligence to being in the "TWIMTBP" program... but no more than that. It wasn't really malicious - they just didn't have the time to implement it in their game code, and ATI wasn't paying them to care about working with their driver team (to find alternatives) like nVidia was (to improve performance).
 
ShuttleLuv said:
It's a good thing to have - I mean, why not? Have your cake and eat it too. :) But I find it really lame how ATi fans boast about this to get people to go ATi. It's barely usable, and when it is, it's crappy imho. It's a technology that, much like anti-aliasing, will be perfected eventually; all card makers will use it and it'll run great. For now, it's like back in 2000 when GeForce2 GTS owners would boast about using AA. Not completely necessary yet, but a cool feature nonetheless. ;)

Oblivion is one game... and a game I certainly wouldn't touch, so meh.

After using this patch I would never be able to go back; even just 2x AA makes the game look a lot cleaner and it still runs perfectly smooth. For me it is much more than a nice-to-have option. But it is true that it's useless for anything less than an X1900XT; lesser cards won't be able to run it.
 
ludachaz said:
After using this patch I would never be able to go back; even just 2x AA makes the game look a lot cleaner and it still runs perfectly smooth. For me it is much more than a nice-to-have option. But it is true that it's useless for anything less than an X1900XT; lesser cards won't be able to run it.
No, that's not true. Running 4xAA+HDR at 800x600 with an X1600XT. Looks a LOT better than 0xAA+HDR at 1280x1024, and performs a LOT better, too.

(Seriously, it's a console port, remember? It's amazing how great the game still looks at 800x600.)
 
dderidex said:
To be fair, you'll notice from the "Chuck" interview that he is basically COMPLETELY changing how the driver handles the rendering in the game, presenting an option that was *not* accessible before.

For all we know, doing it "the old way" really would have taken months for Bethesda to get the code working. It's just that someone on ATI's driver team decided to simply bump around render targets on the game and create a way to blend them together.

It's not like Bethesda can get FSAA+HDR working in their game by changing ATI's drivers for them; they have to change game code! I.e., it's possible Bethesda's method of FSAA+HDR requires months of game-code changes, while ATI's method requires 12 hours of driver-code changes.

See what I'm saying? Not necessarily a conspiracy theory. The only real issue is: when Bethesda noticed this problem, why didn't they just ask ATI what their options were to get it working ASAP? I think we can attribute that lack of diligence to being in the "TWIMTBP" program... but no more than that. It wasn't really malicious - they just didn't have the time to implement it in their game code, and ATI wasn't paying them to care about working with their driver team (to find alternatives) like nVidia was (to improve performance).

I see what you are saying, but from reading the Chuck interview, it seems like Bethesda chose not to implement HDR+AA for a different reason, not because it was too hard or too time-consuming to implement.

Chuck said:
There is a slight catch though. The DX9 API allows apps to create FP16 MSAA surfaces, through CreateRenderTarget(), but they can't be directly used as textures. So while an app that doesn't support FP16 MSAA might create a FP16 texture, render to it, and then use it as a texture, an app that supports FP16 MSAA needs to create a FP16 texture and a FP16 MSAA render target, render into the MSAA buffer, copy the MSAA buffer into the texture, and then use the texture. Now this may sound like a big difference, but it's really not that big of a change. (When you think of how complicated the Oblivion engine is and how many hundreds of surfaces they use, adding this shouldn't be hard.)

Hexus said:
It also makes you wonder why Beth have been hesitant to add it in: allocating the MSAA-able rendertarget and rendering into that, then copying (StretchRect after a global CheckDeviceFormatConversion maybe, to detect support by the driver) to the bound texture before further sampling.

I think it would be interesting if someone would ask Bethesda why they really didn't add HDR+AA, since it seems like it was an easy fix. Oh well, guess we will never know. :(
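
For reference, the app-side change Chuck and Hexus are describing is fairly small in D3D9 terms. Here is a minimal sketch of that path; it is not Bethesda's actual code, and the 4x sample count, sizes, and surrounding engine plumbing are assumptions:

// Minimal D3D9 sketch of the FP16 MSAA path Chuck describes.
// Not Bethesda's code; error handling trimmed, 4x MSAA assumed.
#include <d3d9.h>

// Create both surfaces once: a plain FP16 texture the shaders will sample,
// and an FP16 MSAA render target (per the DX9 API, a surface created with
// CreateRenderTarget cannot be bound as a texture directly).
HRESULT CreateHdrAaSurfaces(IDirect3DDevice9* dev, UINT w, UINT h,
                            IDirect3DTexture9** tex, IDirect3DSurface9** msaaRT)
{
    HRESULT hr = dev->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET,
                                    D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT,
                                    tex, NULL);
    if (FAILED(hr)) return hr;
    return dev->CreateRenderTarget(w, h, D3DFMT_A16B16G16R16F,
                                   D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                                   msaaRT, NULL);
}

// Each frame: render the HDR scene into the MSAA surface, then let
// StretchRect perform the multisample resolve into the plain texture,
// which the tone-mapping pass samples exactly as before.
void RenderHdrFrame(IDirect3DDevice9* dev,
                    IDirect3DTexture9* tex, IDirect3DSurface9* msaaRT)
{
    dev->SetRenderTarget(0, msaaRT);
    // ... draw the HDR scene here ...

    IDirect3DSurface9* texSurf = NULL;
    if (SUCCEEDED(tex->GetSurfaceLevel(0, &texSurf))) {
        dev->StretchRect(msaaRT, NULL, texSurf, NULL, D3DTEXF_NONE);
        texSurf->Release();
    }
    // ... SetTexture(0, tex) and run the tone-mapping pass as usual ...
}

A real engine would also probe support at startup (IDirect3D9::CheckDeviceMultiSampleType on D3DFMT_A16B16G16R16F, or the CheckDeviceFormatConversion check Hexus mentions); X1K parts pass that check for FP16 MSAA while GeForce 7 parts don't, which is why the patch is X1K-only.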
 