GeForce 185.20 Released - Now With Ambient Occlusion

Hardware-Infos.com has its own review up of the drivers now. Please check it out:

http://www.hardware-infos.com/tests.php?test=52

 
After I rolled back from these to the 181.00 drivers, I noticed I had two entries for "Nvidia Control Panel" in the context menu you get when you right-click the desktop. These drivers use a different registry key to trigger the Nvidia context menu item, so I had to delete one of them to get that back to normal.

what I had was: (in HKEY_CLASSES_ROOT\Directory\Background\shellex\ContextMenuHandlers\)

NvCplDesktopContext (saved this one)
NvDesktopContext (deleted this one)

Anyway, I even cleaned with a driver cleaner and it didn't catch this.
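For anyone hitting the same duplicate entry, a .reg file along these lines (assuming the same key names listed above; export/back up the key first) removes the leftover handler. The leading hyphen in front of the key path is what tells regedit to delete the key:

```
Windows Registry Editor Version 5.00

[-HKEY_CLASSES_ROOT\Directory\Background\shellex\ContextMenuHandlers\NvDesktopContext]
```

Save it as remove-nv-context.reg and double-click it, or do the same thing by hand in regedit.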
 
Do ATI's drivers offer this feature? I'd like to use it in my games. If ATI doesn't offer it, then maybe it's time to start looking for a new video card :) I really like how it looks with it on.
 
Installed these and there was no drop-down for Ambient Occlusion! Is this something only for the newer cards? I have an older GTX 260 (192 cores).
 
Make sure you uninstall the previous drivers, then install using ''setup'' and ''pdsetup'', and install PhysX as well. Mine's there.
 
Yeah, I did that: uninstalled the old drivers, booted into safe mode, ran driver cleaner, restarted, and reinstalled all of them using setup with admin rights, and still no go! I'm running the latest Windows 7 beta; I don't know if that might have anything to do with it.
 
According to the OP's site, it works with it.

It should look something like this.
 
OK, tried reinstalling again and still no go! Maybe it is a 216-core-only feature? I have the older 192-core card.
 
Texturing and dynamic mip-map generation are still fixed function features. AO is just a pixel shader. There's no way for the hardware or driver to know that it's an AO shader that's running.

Yes it can. Drivers override what the game does all the time; that is why performance changes between driver releases. It would be child's play for Nvidia to swap out Crytek's AO pixel shader with one of their own at run time. Remember the old ATI "Chuck" driver? That modified the Oblivion rendering path to use features of ATI cards to allow HDR + AA. It's nothing new for drivers to alter how a game does its rendering.

Do ATI's drivers offer this feature? I'd like to use it in my games. If ATI doesn't offer it, then maybe it's time to start looking for a new video card :) I really like how it looks with it on.

It isn't a driver feature; it is a game feature. Both ATI and Nvidia have had AO for a very, very long time, as it is just a pixel shader (it might even run on cards as old as the 9700 Pro, depending on whether it needs DX9.0c features). No one except Nvidia knows what the drop-down actually does yet.
 
Windows Vista Ultimate 64-bit
GeForce 185.20
Nvidia PhysX 8.11.18
Crysis: WARHEAD
DirectX 10

2048 x 1152 (16:9)

Core i7 965 Extreme @ 3.74GHz
GeForce GTX 280 @ 685/2240

AO OFF (19fps)

http://img171.imageshack.us/img171/8499/aooff19fpsme6.jpg


AO LOW (17.8fps)

http://img171.imageshack.us/img171/4083/aolow178fpsye0.jpg


AO MEDIUM (16.6fps)

http://img171.imageshack.us/img171/7163/aomedium166fpsut7.jpg


AO HIGH (13.1fps)

http://img211.imageshack.us/img211/6637/aohigh131fpshg7.jpg
 
Yeah, I did that: uninstalled the old drivers, booted into safe mode, ran driver cleaner, restarted, and reinstalled all of them using setup with admin rights, and still no go! I'm running the latest Windows 7 beta; I don't know if that might have anything to do with it.
That might be it.
I have no drop-down for it either on Windows 7.
 
Hmmm... I think it looks better OFF. But TBH, aside from some of the leaves and that rooftop having a bit more sun shining on them, I really do not see any differences.

I did notice a huge difference with the shadows: they are darker, and the shadows cast on the tree trunks and vegetation look more realistic.
 
I made a comparison for Left 4 Dead.

Ambient Occlusion ON (High)
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_ON_2/

Ambient Occlusion OFF
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_OFF_2/
___________

Ambient Occlusion ON (High)
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_ON/

Ambient Occlusion OFF
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_OFF/


It does give the game a nicer, "fuller" look, but the performance hit is pretty massive. Went from ~75 FPS to 20.

___________


AO ON
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_ON_3/

AO OFF
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_OFF_3/
 
Windows Vista Ultimate 64-bit
GeForce 185.20
Nvidia PhysX 8.11.18
Crysis: WARHEAD
DirectX 10

2048 x 1152 (16:9)

Core i7 965 Extreme @ 3.74GHz
GeForce GTX 280 @ 685/2240

AO OFF (19fps)

http://img171.imageshack.us/img171/8499/aooff19fpsme6.jpg

AO LOW (17.8fps)

http://img171.imageshack.us/img171/4083/aolow178fpsye0.jpg

AO MEDIUM (16.6fps)

http://img171.imageshack.us/img171/7163/aomedium166fpsut7.jpg

AO HIGH (13.1fps)

http://img211.imageshack.us/img211/6637/aohigh131fpshg7.jpg

Interesting. I wonder what the driver is doing to the game's SSAO and how this compares to just setting the game's SSAO quality levels using a tweaked INI

I made a comparison for Left 4 Dead.

Ambient Occlusion ON (High)
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_ON_2/

Ambient Occlusion OFF
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_OFF_2/
___________

Ambient Occlusion ON (High)
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_ON/

Ambient Occlusion OFF
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_OFF/


It does give the game a nicer, "fuller" look, but the performance hit is pretty massive. Went from ~75 FPS to 20.

I honestly cannot spot any difference at all between those shots. I can in the Crysis ones, but I don't see anything changing (other than your character and the camera position, of course). Well, in the first set the buildings on the bottom left are darker, but I think that's supposed to be distance fog like with AO off, so I would consider that a loss of quality (or a bug).
 
I honestly cannot spot any difference at all between those shots. I can in the Crysis ones, but I don't see anything changing (other than your character and the camera position, of course). Well, in the first set the buildings on the bottom left are darker, but I think that's supposed to be distance fog like with AO off, so I would consider that a loss of quality (or a bug).
In the first shot, look at the buildings to the right of the apartment.

In the second shot, look at where the buildings meet the ground and the poles near the fire.

And here is a much better comparison. The effect is more pronounced indoors. Not very playable on this 8800 GTS 512, though.

AO ON
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_ON_3/

AO OFF
http://www.wegame.com/view/FW_185_20_Ambient_Occlusion_OFF_3/

Camera position didn't change this time because I figured out that you could change the setting without having to restart the game. :p
 
Yes it can. Drivers override what the game does all the time, that is why performance changes between driver releases. It would be child's play for nvidia to swap out Crytek's AO pixel shader with one of their own at run time. Remember the old ATI "Chuck" driver? That modified the Oblivion rendering path to use features of ATI cards to allow HDR + AA. Its nothing new for drivers to alter how a game is doing rendering.

No it can't, not in any sort of universal fashion. There's no such thing as an "AO" shader that the driver can automatically detect and replace. That's what I'm referring to. The shader replacement you refer to is simply the driver maintaining a database of games and specific shaders that the drivers look for which are replaced when found. There's nothing generic or automated about that.

Also, shader replacement doesn't exactly work as an explanation when there's no shader to replace does it?
 
Great, more features to make Crysis look even better and be even more completely unplayable.
 
No it can't, not in any sort of universal fashion. There's no such thing as an "AO" shader that the driver can automatically detect and replace. That's what I'm referring to. The shader replacement you refer to is simply the driver maintaining a database of games and specific shaders that the drivers look for which are replaced when found. There's nothing generic or automated about that.

Also, shader replacement doesn't exactly work as an explanation when there's no shader to replace does it?

I have yet to see a case where AO is being enabled in a game that doesn't have it by default. Drivers have had per-game optimizations for a very, very long time now. The drivers are already maintaining a DB of games and optimizations, and I highly doubt this is universal. The Source engine (on which L4D runs) has AO by default.

What could also be happening is that the AO drop down is running another AO pixel shader ON TOP OF whatever the game is doing. That would both be universal and apply to games that don't have it.
 
The Source engine (on which L4D runs) has AO by default.

Do you have a source for this? This also works on Fallout 3 and I haven't seen any reference to real-time dynamic AO in that title before either. Bioshock as well. If you do have sources showing that these titles support AO out of the box it would be nice to see them.

What could also be happening is that the AO drop down is running another AO pixel shader ON TOP OF whatever the game is doing. That would both be universal and apply to games that don't have it.

Yep, that's what I and others are suggesting.
 
Do you have a source for this? This also works on Fallout 3 and I haven't seen any reference to real-time dynamic AO in that title before either. Bioshock as well. If you do have sources showing that these titles support AO out of the box it would be nice to see them.

http://developer.valvesoftware.com/wiki/Source_Engine_Features
Self-shadowed Bump Maps create soft shadows and ambient occlusion with both dynamic and pre-calculated radiosity lighting. Source renders self-shadowed bump maps on both current and older-generation graphics hardware.

Fallout 3 uses the Gamebryo engine
http://www.emergent.net/en/Multimedia/Demos/
Emerge, originally designed for the 2007 Game Developers Conference (GDC), is a technical demo showcasing several Gamebryo systems, including camera, shadow, and ambient occlusion.

The Unreal 3 Engine also has Ambient Occlusion
http://www.digitalbattle.com/2008/02/22/gdc-features-of-next-unreal-engine-3-version-revealed/

I think that covers all the games you just listed :)

Yep, that's what I and others are suggesting.

But that doesn't sit right with me. That is like Nvidia saying it knows how the game should look better than the developers do, and no dev is ever going to go for that (especially not since the pixel shader source for SSAO is freely available and rather easy to implement). And, of course, stacking AO effects on top of each other will not only needlessly slow the game down, but could also make dark, moody games damn near impossible to play, with the shadows being twice as dark.
 

Yes, it does! It's not clear that these games actually use the engine feature, though. Unfortunately, besides Crysis I don't think any of these let you disable the in-game AO, which would be an easy way to verify.

But that doesn't sit right with me. That is like Nvidia saying it knows how the game should look better than the developers do, and no dev is ever going to go for that (especially not since the pixel shader source for SSAO is freely available and rather easy to implement). And, of course, stacking AO effects on top of each other will not only needlessly slow the game down, but could also make dark, moody games damn near impossible to play, with the shadows being twice as dark.

I agree. If these titles already employ AO out of the box, the option doesn't make sense, because the devs would have set up IQ in such a way that it is compatible with the level of AO they're using.
 
Texturing and dynamic mip-map generation are still fixed function features. AO is just a pixel shader. There's no way for the hardware or driver to know that it's an AO shader that's running.

It would still need a texture map for geometry, wouldn't it? I read up a bit on Nvidia's use of AO, and one method used is to convert a polygon mesh into disk-shaped elements. If the density of the light and shadow on a surface is reduced (like in mipmap scaling in drivers), wouldn't that make it scalable?

The games listed already have AO in their engines, and it can be enabled regardless of whether the GFX card is from ATI or Nvidia. It seems they are using that, though.

I would, if this AO option was being touted as something big as you claim. But thus far this is just a leaked driver. We have no idea how it's implemented or how Nvidia plans to market it or how extensive the game support will be.

I wasn't talking about Nvidia touting it, but rather users who earlier downplayed DX10.1. :)

You make a fair point though - Nvidia really can't pump up AO while openly dismissing DX10.1. One critical factor is missing though, and that is evidence of real-time GI performance using indexable cube maps. ATi's ping-pong demo is rudimentary at best.

No, they can't. Isn't the Gather4 texture fetch in DX10.1 more optimal for AO as well, meaning that DX10.1 would give faster AO as well as GI?

Cube maps were already introduced in DX10 (CoH being first, I think), but they are created one per pass, like AA. DX10.1 offers multiple reads/writes per pass with indexable cube maps, which can be used for cloning as far as I understood.

Indexable cube map performance is demonstrated pretty nicely here:
http://www.humus.name/index.php?page=3D&ID=80

Not the most beautiful demos, but it absolutely shows the potential for speed, which has been the Achilles' heel of GI. AO can only give so much realism, and GI can give so much more. :)
 
I've always felt that HDR overbleached scenery. AO seems to more accurately take into account where lighting should be softer: where an area is too well lit by HDR, objects blocking light sources should make it not as intense.

I have no expertise in shaders etc., but based on my experience with it, this is what it seems to do. What I'm wondering, though, is why aren't the HDR shaders just modified to do this automatically, without having a second algorithm to tone it down? It just seems like paying a painter to paint a fence, and the painter paying another guy to do the painting for him.

Dynamic light sources have always felt too cluttery.
 
I've always felt that HDR overbleached scenery. AO seems to more accurately take into account where lighting should be softer: where an area is too well lit by HDR, objects blocking light sources should make it not as intense.

I have no expertise in shaders etc., but based on my experience with it, this is what it seems to do. What I'm wondering, though, is why aren't the HDR shaders just modified to do this automatically, without having a second algorithm to tone it down? It just seems like paying a painter to paint a fence, and the painter paying another guy to do the painting for him.

Dynamic light sources have always felt too cluttery.

HDR and AO do different things. HDR is the range of a light; AO is the effect of light bouncing off walls and such (as is global illumination). They do different things, hence why they aren't "combined". Also, AO depends on the results of the HDR pass, so AFAIK they can't run at the same time.

HDR: Simulates your eye's iris.
AO: Simulates bouncing light.

Get it?

The reason dynamic light sources feel cluttery is because they are way way too expensive to do correctly, so there are a whole bunch of techniques to fake it well enough that the viewer doesn't know the difference. AO is just another means of faking global illumination, since real GI was too expensive/not feasible before DX10.1. Ray tracing, for example, gives very realistic lighting and is very, very simple to code (like you can write a ray tracing engine in a weekend), but it is far too expensive and slow.
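To make the "they do different things" point concrete, here is a toy sketch (not any game's actual pipeline, and the lighting values are made up) of how AO and HDR compose: AO scales the ambient lighting term before shading, while HDR tone mapping compresses the final radiance down to displayable range afterwards.

```python
# Toy illustration of why ambient occlusion and HDR tone mapping are
# separate, composable steps: AO changes how much ambient light arrives
# at a surface; HDR changes how the resulting radiance is displayed.

def shade_pixel(albedo, direct_light, ambient_light, ao):
    """Combine lighting terms; 'ao' in [0,1] darkens only the ambient term."""
    return albedo * (direct_light + ambient_light * ao)

def tonemap(radiance):
    """Simple Reinhard operator: maps HDR radiance [0, inf) into [0, 1)."""
    return radiance / (1.0 + radiance)

# A pixel in a crevice (ao=0.3) vs. one in the open (ao=1.0), same lights:
crevice = tonemap(shade_pixel(albedo=0.8, direct_light=0.2,
                              ambient_light=1.5, ao=0.3))
open_air = tonemap(shade_pixel(albedo=0.8, direct_light=0.2,
                               ambient_light=1.5, ao=1.0))
# The crevice pixel comes out darker even though both go through the same
# tone mapper, because AO reduced its ambient contribution before shading.
```

This is also why "just modify the HDR shader" doesn't quite work: the occlusion factor has to be computed per pixel from scene geometry before shading, while the tone map is a uniform operation on the finished image.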

Yes, it does! It's not clear that these games actually use the engine feature, though. Unfortunately, besides Crysis I don't think any of these let you disable the in-game AO, which would be an easy way to verify.

Why wouldn't they use the engine feature? It doesn't require any extra coding work or graphics work to enable AO. It is one of those things that just works.

That said, you can enable a display of the occlusion engine in process in Left4dead. Open the console, type "find occlusion" and you can see the two debug options you can enable (after setting "sv_cheats 1", of course). I haven't figured out how or if it is possible to control the quality of the game's AO, however :)
 
HDR and AO do different things. HDR is the range of a light; AO is the effect of light bouncing off walls and such (as is global illumination). They do different things, hence why they aren't "combined". Also, AO depends on the results of the HDR pass, so AFAIK they can't run at the same time.

HDR: Simulates your eye's iris.
AO: Simulates bouncing light.

Get it?

The reason dynamic light sources feel cluttery is because they are way way too expensive to do correctly, so there are a whole bunch of techniques to fake it well enough that the viewer doesn't know the difference. AO is just another means of faking global illumination, since real GI was too expensive/not feasible before DX10.1. Ray tracing, for example, gives very realistic lighting and is very, very simple to code (like you can write a ray tracing engine in a weekend), but it is far too expensive and slow.



Why wouldn't they use the engine feature? It doesn't require any extra coding work or graphics work to enable AO. It is one of those things that just works.

That said, you can enable a display of the occlusion engine in process in Left4dead. Open the console, type "find occlusion" and you can see the two debug options you can enable (after setting "sv_cheats 1", of course). I haven't figured out how or if it is possible to control the quality of the game's AO, however :)

But AO is just shadowing (when occluded) whereas GI can cause the color of the room to change based on this bouncing light. I.E. with GI if a light bounces off a blue wall, you get a subtle blue hue on the wall that light lands on. This doesn't happen with AO. So they're pretty much very different.
 
But AO is just shadowing (when occluded) whereas GI can cause the color of the room to change based on this bouncing light. I.E. with GI if a light bounces off a blue wall, you get a subtle blue hue on the wall that light lands on. This doesn't happen with AO. So they're pretty much very different.

I wouldn't call that very different. GI does a bit more, and is higher quality. AO is a subset of GI.
 
That said, you can enable a display of the occlusion engine in process in Left4dead. Open the console, type "find occlusion" and you can see the two debug options you can enable (after setting "sv_cheats 1", of course). I haven't figured out how or if it is possible to control the quality of the game's AO, however :)

That shows occlusion culling, not ambient occlusion. Occlusion culling is when things out of view don't get rendered. The console commands draw everything that is culled green.
 
I'm running dual GTX 280s in SLI, and I cranked AO to high in Left 4 Dead; it was a massive framerate hit.

On low it's not noticeable at all, but it does offer some nice improvements. On medium or high I can notice the framerate drop.
 
That shows occlusion culling, not ambient occlusion. Occlusion culling is when things out of view don't get rendered. The console commands draw everything that is culled green.

Ah, I didn't actually try any of the commands as I was playing online at the time (so I couldn't do sv_cheats 1 to try it). Do you know any console commands to adjust AO in L4D?
 
It would still need a texture map for geometry, wouldn't it? I read up a bit on Nvidia's use of AO, and one method used is to convert a polygon mesh into disk-shaped elements. If the density of the light and shadow on a surface is reduced (like in mipmap scaling in drivers), wouldn't that make it scalable?

I'm not sure what you mean by "density of light and shadow or surface is reduced" but the Crytek approach is to sample the depth buffer around the given pixel and depending on the distance between depth buffer samples in each direction it guesstimates the amount of occluding geometry. Scaling is achieved by increasing the number of samples or the range of the sampling.
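The depth-buffer sampling idea described above can be sketched roughly like this. To be clear, this is a hedged illustration of the general screen-space AO technique, not Crytek's actual shader: the kernel shape, radius, sample count, and falloff threshold here are all made-up values.

```python
# Rough sketch of screen-space ambient occlusion: sample the depth buffer
# around a pixel and estimate occlusion from how many samples are nearer
# to the camera (i.e., geometry that would block ambient light).
import random

def ssao_estimate(depth, x, y, radius=2, n_samples=8, max_delta=0.5):
    """Return an occlusion estimate in [0, 1] for pixel (x, y).

    'depth' is a 2D list of view-space depths (smaller = nearer camera).
    Scaling quality means changing n_samples and/or radius, as noted above.
    """
    h, w = len(depth), len(depth[0])
    center = depth[y][x]
    occluded = 0
    for _ in range(n_samples):
        # Random offset within the sampling radius, clamped to the buffer.
        sx = min(max(x + random.randint(-radius, radius), 0), w - 1)
        sy = min(max(y + random.randint(-radius, radius), 0), h - 1)
        delta = center - depth[sy][sx]   # positive: the sample is nearer
        if 0 < delta < max_delta:        # nearby nearer geometry occludes;
            occluded += 1                # far-away differences are ignored
    return occluded / n_samples

# A pixel next to a nearer "ledge" (the 0.7 region) picks up some occlusion:
random.seed(0)
depth = [[1.0, 1.0, 0.7, 0.7],
         [1.0, 1.0, 0.7, 0.7],
         [1.0, 1.0, 1.0, 1.0],
         [1.0, 1.0, 1.0, 1.0]]
ao = ssao_estimate(depth, x=1, y=1)
```

A fully flat depth region would return 0 occlusion, since no sample is nearer than the center pixel; that matches the intuition that SSAO only darkens creases and corners.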

Crysis Section 8.5.4.3
http://delivery.acm.org/10.1145/129...coll=ACM&dl=ACM&CFID=15151515&CFTOKEN=6184618

Starcraft II
http://developer.amd.com/gpu_assets/S2008-Filion-McNaughton-StarCraftII.pdf

No, they can't. Isn't the Gather4 texture fetch in DX10.1 more optimal for AO also? Meaning that Dx10.1 would give faster AO as well as GI?

Not necessarily, DX10.1 allows access to the multi-sampled depth buffer. But this algorithm seems to work fine at either pixel level or even lower resolution than that. Starcraft II is using the technique and it's using a DX9 engine. Gather4 wouldn't be relevant here as the location of the individual depth buffer samples aren't in a uniform 2x2 block that Gather4 would return.

Cube maps have been introduced in DX10 already (COH being first I think), but they are created one per pass like AA. DX10.1 offers multiple read/writes per pass with indexable cube maps which can be used for cloning as far as I understood.

True; what I'm saying is that even with the speedup afforded by indexable cube maps, we have no real evidence that current hardware is capable of real-time GI using DX10.1. So far all we have are very small demos. Of course having it would be great, but there's no guarantee that we would have usable GI today.
 
I know this may sound really tacky of me, but are there any screenshots of what this new AO setting looks like in games that don't have some measure of AO in them by default? I kind of want to see what this looks like in a game like WoW or CS:S.
 
I know this may sound really tacky of me, but are there any screenshots of what this new AO setting looks like in games that don't have some measure of AO in them by default? I kind of want to see what this looks like in a game like WoW or CS:S.
I tried AO in CS:S. I couldn't see a difference. I took identical screenshots of the setting off and on and they were exactly the same.
 