GeForce 185.20 Released - Now With Ambient Occlusion

I'm running dual GTX 280s in SLI, and with AO cranked to High in Left 4 Dead it was a massive framerate hit.

On Low it's not noticeable at all, but it does offer some nice improvements. On Medium or High I can notice the framerate drop.

That sounds about right. AO cost scales with the resolution at which it is applied. If you game at a high resolution, say 1920x, and 'High' means AO is applied at that full resolution, it'd have much more of an impact than dropping down a few notches. Halving the resolution AO is applied at cuts the processing power needed to roughly a quarter.
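As a back-of-envelope check on that scaling claim (the 1920x1200 base resolution here is my assumption for the example, not the poster's):

```python
# Back-of-envelope cost of a screen-space effect at different resolutions.
# Cost is roughly proportional to the number of pixels shaded.

def relative_ao_cost(width, height, base_width=1920, base_height=1200):
    """Cost of running AO at (width, height), relative to the base resolution."""
    return (width * height) / (base_width * base_height)

full = relative_ao_cost(1920, 1200)  # 1.0  -- AO at full resolution
half = relative_ao_cost(960, 600)    # 0.25 -- half resolution on each axis
```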
 
I don't really get this whole ambient occlusion thing.
Now, I have toyed around with this effect myself... and the way I did it was to precalculate the average occlusion for every vertex by raycasting the hell out of the model, and storing it as one of the vertex's attributes.
In other words, it requires a lot of precalculation at the content design phase, aside from specific vertex shader support.
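A minimal sketch of that precompute step, assuming the mesh is given as vertex/normal lists and the caller supplies an `occluded(origin, direction)` ray test against the model (all names here are mine, purely for illustration):

```python
import math
import random

def random_hemisphere_dir(normal):
    """Rejection-sample a unit vector, flipped into the normal's hemisphere."""
    while True:
        d = [random.uniform(-1, 1) for _ in range(3)]
        length = math.sqrt(sum(c * c for c in d))
        if 0 < length <= 1:
            d = [c / length for c in d]
            dot = sum(a * b for a, b in zip(d, normal))
            return d if dot >= 0 else [-c for c in d]

def precompute_vertex_ao(vertices, normals, occluded, n_rays=64):
    """For each vertex, fire random rays over the normal-facing hemisphere and
    store the fraction that escapes the model as a per-vertex AO term.
    `occluded(origin, direction)` is a placeholder for the mesh's ray test."""
    ao = []
    for v, n in zip(vertices, normals):
        hits = sum(1 for _ in range(n_rays) if occluded(v, random_hemisphere_dir(n)))
        ao.append(1.0 - hits / n_rays)  # 1.0 = fully open, 0.0 = fully occluded
    return ao
```

The resulting list would then be baked into the vertex data and read by the vertex shader at runtime, which is exactly why it has to happen at content design time.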

So why is this a standard driver feature? As far as I know, this can only work with games that already have it precalculated in their content, with nVidia performing shader replacement.
In other words, quite weird.

Edit: Ah, I suppose they are deriving it in screen space. In that case you won't need preprocessed model data... but they'd still need to replace the shaders.
I just don't think it's an option that a driver should provide.
 
They don't need to replace anything necessarily. But they could "inject" their own SSAO shader.
 
> They don't need to replace anything necessarily. But they could "inject" their own SSAO shader.

That requires knowledge of the rendering algorithm of the application though.
You need to access the depth buffer at the 'right time', which depends very much on how the renderer works.
Sure, it might work most of the time if you just assume that the depth buffer is valid at the end of frame rendering... But still, it's a big hack and hardly foolproof. Aside from that, you want to affect only the ambient component... you don't want to be 'too late' and dim the diffuse and specular components as well; that'd look wrong.
On top of that, it's quite hard to have any kind of control over quality, since the depth range will vary from application to application.
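For illustration, here's a crude CPU-side sketch of the screen-space idea being debated, assuming a normalized depth buffer. Note how the hard-coded `radius` and `bias` constants (my own made-up values) embody exactly the per-application tuning problem described above:

```python
# Minimal screen-space AO sketch: darken a pixel by how many nearby depth
# samples sit in front of it. `radius` and `bias` must be tuned to the
# application's depth range -- the injection problem described above.

def ssao(depth, x, y, radius=2, bias=0.002):
    """depth: 2D list of normalized depth values. Returns an occlusion factor
    in [0, 1], where 1.0 means fully unoccluded."""
    h, w = len(depth), len(depth[0])
    center = depth[y][x]
    occluded = total = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            sx, sy = x + dx, y + dy
            if 0 <= sx < w and 0 <= sy < h:
                total += 1
                if depth[sy][sx] < center - bias:  # neighbour is closer: occluder
                    occluded += 1
    return 1.0 - occluded / total if total else 1.0
```

A real driver-side implementation would run this as a pixel shader over the depth buffer, but the tuning problem is the same: sensible `radius` and `bias` values depend on how each game lays out its depth range.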
 
There's always been a problem with World of Warcraft where if you play the game in window mode with SLI enabled, your fps would be cut in half. That no longer happens for me with these drivers.
 
> That requires knowledge of the rendering algorithm of the application though.
> You need to access the depth buffer at the 'right time', which depends very much on how the renderer works.
> Sure, it might work most of the time if you just assume that the depth buffer is valid at the end of frame rendering... But still, it's a big hack and hardly foolproof. Aside from that, you want to affect only the ambient component... you don't want to be 'too late' and dim the diffuse and specular components as well; that'd look wrong.
> On top of that, it's quite hard to have any kind of control over quality, since the depth range will vary from application to application.

That is why I suspect that nvidia is replacing or removing the AO shader of specific games, which is why it has only been showcased for games that already have AO. Then it makes sense: "Hey, the new drivers are a huge improvement in FPS!" Oh wait, AO defaults to off, so IQ took a major hit. Oh well, it's FASTER!
 
We shouldn't have to guess about that. Somebody with Crysis can just set AO to off in the CP and set ingame settings to High or Very High and see if AO is disabled.
 
> That is why I suspect that nvidia is replacing or removing the AO shader of specific games, which is why it has only been showcased for games that already have AO. Then it makes sense: "Hey, the new drivers are a huge improvement in FPS!" Oh wait, AO defaults to off, so IQ took a major hit. Oh well, it's FASTER!

Uhhh, what? It's not removing anything. It's just adding in its own SSAO. There's no point in doing it for games that already have it, though (Crysis).
 
> Uhhh, what? It's not removing anything. It's just adding in its own SSAO. There's no point in doing it for games that already have it, though (Crysis).

Look at the Crysis screenshots someone posted with different AO levels. It kind of looks like the driver is disabling the game's AO when AO is set to Off in the control panel. It is also obviously changing something when AO is set to different levels in the control panel, even though Crysis has its own AO.

It should be easy enough for someone to test, though. Just set AO to Off in the control panel, then enable/disable Crysis's SSAO. If there isn't any change, then the driver is removing Crysis's AO :)
 
> Look at the Crysis screenshots someone posted with different AO levels. It kind of looks like the driver is disabling the game's AO when AO is set to Off in the control panel. It is also obviously changing something when AO is set to different levels in the control panel, even though Crysis has its own AO.
>
> It should be easy enough for someone to test, though. Just set AO to Off in the control panel, then enable/disable Crysis's SSAO. If there isn't any change, then the driver is removing Crysis's AO :)

Absolutely not. It isn't removing the AO, and it really has no way of doing that algorithmically. It doesn't know that AO is being done, and they didn't profile it to manually identify Crysis' AO method and circumvent it.

The SSAO is there in the screenshots unless the person taking the shots has manually disabled it in the console in devmode.

The driver can't just intelligently "disable AO" in games to apply its own. The option is there for games which don't have any SSAO functionality.

If you want to compare SSAO methods, take a shot of the Crysis built-in SSAO, then disable the in-game Crysis SSAO and compare that to the NVIDIA CP SSAO.

I can guarantee you the driver and control panel aren't trying to do something retarded like un-doing what the application is doing. They're simply adding to the scene.. never taking away.
 
For some reason I have the same driver, but AO isn't even in there. Do I have to have a certain video card to get it?

9800gtx
 
> Absolutely not. It isn't removing the AO, and it really has no way of doing that algorithmically. It doesn't know that AO is being done, and they didn't profile it to manually identify Crysis' AO method and circumvent it.

How do you know? They already have a Crysis profile in the drivers, it would be child's play to detect when the game tries to run the AO shader and then just not run it.

> The driver can't just intelligently "disable AO" in games to apply its own. The option is there for games which don't have any SSAO functionality.

Then can someone take a screenshot showing AO in a game that doesn't have it? The source engine (HL2 EP2, L4D, etc..) has AO, or at least lists it as a feature of the engine.

> If you want to compare SSAO methods, take a shot of the Crysis built-in SSAO, then disable the in-game Crysis SSAO and compare that to the NVIDIA CP SSAO.

That sounds like a good idea - can anyone do this? (I don't have an nvidia card or I would :) )

> I can guarantee you the driver and control panel aren't trying to do something retarded like un-doing what the application is doing. They're simply adding to the scene.. never taking away.

Nvidia's drivers constantly take away, such as forcing quality instead of high quality. Remember, this IS the company that tried cheating in 3dmark ;)
 
Here's a comparison from HL2 Ep 2. You can see the difference clearly if you flick between them. Turning it on totally kills the framerate, but it's a nice effect.

http://img113.imageshack.us/img113/7205/aooffgg9.jpg

http://img113.imageshack.us/img113/4622/aoonel3.jpg

Kind of looks like they did an edge-detect and then put a shadow on the edges. I know there is more going on, but that was just my first impression :)

Does look nice with the effect, but I'm pretty sure more games are just going to start including it themselves...
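That first impression isn't far off: SSAO output is strongest around depth discontinuities and creases, so a plain depth edge-detect produces a superficially similar darkening. A toy sketch of that comparison (my own illustration, with a made-up `strength` constant):

```python
# Why SSAO can read as "shadowed edges": occlusion concentrates at depth
# discontinuities. A crude depth edge-detect produces a similar darkening.

def edge_darken(depth, x, y, strength=4.0):
    """Darkening factor from the local depth gradient (central differences).
    Returns 1.0 for flat regions, approaching 0.0 at sharp depth edges."""
    gx = depth[y][x + 1] - depth[y][x - 1]
    gy = depth[y + 1][x] - depth[y - 1][x]
    grad = (gx * gx + gy * gy) ** 0.5
    return max(0.0, 1.0 - strength * grad)
```

Real SSAO does more than this (it samples a neighbourhood rather than just the gradient, so it also darkens concave corners with no hard edge), but the visual overlap explains the first impression.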
 
> For some reason I have the same driver, but AO isn't even in there. Do I have to have a certain video card to get it?
>
> 9800gtx

You need Vista. AO is a DX10 only option.


On a side note: These drivers fixed all kinds of instability issues I was having with 178.25 - 180.70 drivers while OC'ing my 8800GT. They also added a nice little kick in performance.

Good drivers, but be warned: some 260/280 owners are reporting black screens after installing them.

Make sure to do a complete uninstall of previous drivers, run Driver Cleaner, and then install these.
 
> How do you know? They already have a Crysis profile in the drivers, it would be child's play to detect when the game tries to run the AO shader and then just not run it.



> Then can someone take a screenshot showing AO in a game that doesn't have it? The source engine (HL2 EP2, L4D, etc..) has AO, or at least lists it as a feature of the engine.



> That sounds like a good idea - can anyone do this? (I don't have an nvidia card or I would :) )



> Nvidia's drivers constantly take away, such as forcing quality instead of high quality. Remember, this IS the company that tried cheating in 3dmark ;)

First off, cheating at 3dmark has nothing to do with forcing a change in the look from what the developer intended for Crysis. They know better than to change what the developer is doing with their game purely to use their own SSAO.

Secondly both companies have been caught cheating, and not just at 3dmark, so your point is moot.

Do you really think they're arbitrarily going to be replacing shaders with their own effects, manually loading up each game and finding the shaders purely so they can replace the AO with something that isn't necessarily "better", just different, all while pissing off developers whose games they change the look of?

Trust me. I can guarantee you they're not replacing any original AO, and the last thing they'd want to do is set the precedent for profiling apps shaders to replace, as this would be a nightmarish task to do for all games.

I can't speak for the source engine as a whole, but L4D does not have AO, and I doubt anything preceding it in the Source evolution does either. It's pretty clear L4D doesn't have it here, in the AO Off shot:

AO Off:
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_OFF_3/

AO On:
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_ON_3/

Fallout 3:

Off: http://i42.tinypic.com/2v9xjl5.jpg
Low: http://i44.tinypic.com/ncbo0y.jpg
Med: http://i42.tinypic.com/6fmvwz.jpg
High: http://i40.tinypic.com/15gwnzn.jpg

Farcry 2:

Off: http://i42.tinypic.com/jawe3k.jpg
Low: http://i44.tinypic.com/2rcol1g.jpg
Med: http://i39.tinypic.com/2rhbmzt.jpg
High: http://i40.tinypic.com/33fcd8k.jpg


> You need Vista. AO is a DX10 only option.

Actually, no. The games this works with are not DX10; it's clearly not just a DX10 option when it runs with DX9 games in Vista.

You do realize that Vista simply *offers* DX10 functionality, but it can not and does not force it, right? Games written in DX9 still run in DX9 in Vista unless they explicitly have a DX10 path.

Left 4 Dead, Fallout 3, Far Cry 2 all run a DX9 path, and FC2 also has a DX10 path.
 
> You need Vista. AO is a DX10 only option.

This is an nVidia driver. They can enable all shader features they want inside their driver.
You can use the extra shader capabilities in Win XP through OpenGL extensions as well.
You need DX10-level hardware to get support for AO probably, but neither the game nor the OS need to support it.
 
> Actually, no. The games this works with are not DX10; it's clearly not just a DX10 option when it runs with DX9 games in Vista.
>
> You do realize that Vista simply *offers* DX10 functionality, but it can not and does not force it, right? Games written in DX9 still run in DX9 in Vista unless they explicitly have a DX10 path.
>
> Left 4 Dead, Fallout 3, Far Cry 2 all run a DX9 path, and FC2 also has a DX10 path.

Ok. Let me rephrase. In order to have the option available you must have DX10 installed. Period. It's been discussed 100 times over at Guru3D. In order to have DX10 installed you must have Vista. Sure, there is an "Alpha" of DX10 for XP out, but I don't consider that a trusted release.

So, for someone running XP they are not going to see the option in their NVCP.

Nowhere did I mention that the option isn't usable in DX9 games on Vista.

Am I wrong here?
 
> Ok. Let me rephrase. In order to have the option available you must have DX10 installed. Period. It's been discussed 100 times over at Guru3D. In order to have DX10 installed you must have Vista. Sure, there is an "Alpha" of DX10 for XP out, but I don't consider that a trusted release.
>
> So, for someone running XP they are not going to see the option in their NVCP.
>
> Nowhere did I mention that the option isn't usable in DX9 games on Vista.
>
> Am I wrong here?

This has nothing to do with "having DX10" installed. It has nothing to do with DX10, at all.

The feature is simply only in the Vista drivers, but that's not to say it couldn't be in the XP drivers too. The feature is probably only on Tesla class hardware which is what G80 and higher are based on. It just so happens that they're DX10 parts, but the key reason is the architecture, not support for DX10.

There is no DX10 for XP, and that alpha you're referring to is a hard coded hack to run a couple SDK samples, nothing more. It could never run a game in DX10 mode. It's not running DX10.
 
> Nvidia's drivers constantly take away, such as forcing quality instead of high quality. Remember, this IS the company that tried cheating in 3dmark ;)

Hmmm, poor analogy. Texture filtering is a hardware accelerated, fixed-function process. There are no API guidelines for filtering quality as far as I know and AMD uses the same optimizations. But in any case, it's a far cry from manipulating application code.

I guess all the people interested in verifying this are either too lazy to do it themselves or don't have the hardware. I'm assuming this works on the Crysis demo too.
 
> There are no API guidelines for filtering quality as far as I know and AMD uses the same optimizations.

If I'm not mistaken, there are certain basic rules for rasterizing and texture filtering accuracy for WHQL certification.
And these are more strict in DX10 than they were earlier.
 
> You need Vista. AO is a DX10 only option.

No, it isn't. AO (Well, Crysis's version anyway) runs as a *SHADER*. It is NOT a feature of DX10, the hardware, or Vista. It will work on ALL DX9.0C CARDS. If this option doesn't exist in the XP version of the driver it is because nvidia chose not to put it there. It has absolutely NOTHING to do with needing Vista or DX10.

Granted, nvidia's version may be different, but AO definitely doesn't need DX10.
 
Hey guys, anyone know why it wouldn't show up for me? I have 185.20 but it's not in the options. I am running Windows 7 right now, though.
 
> Hey guys, anyone know why it wouldn't show up for me? I have 185.20 but it's not in the options. I am running Windows 7 right now, though.

Don't know why, but I know this has been mentioned... others have said it doesn't work in Win7.
 