GeForce 185.20 Released - Now With Ambient Occlusion

AuDioFreaK39

XFastest is the first to post the new GeForce 185.20 drivers, including a new version of the Folding@home client for NVIDIA GPUs.

http://www.xfastest.com/viewthread.php?tid=17608&extra=page=1

Links are now up:

Folding@home_GPU_v620nv
ForceWare 185.20 XP (32-bit)
ForceWare 185.20 XP (64-bit)
ForceWare 185.20 Vista (32-bit)
^ These drivers are confirmed working for Windows 7 Ultimate build 7000 as well.
ForceWare 185.20 Vista (64-bit)

Interestingly enough, these new drivers contain a new Ambient Occlusion setting in the Nvidia Control Panel:


[Screenshot: the new Ambient Occlusion setting in the NVIDIA Control Panel]
 
Thanks very much for the heads up.

Also, I'll go ahead and get the stupid question out of the way: what exactly does Ambient Occlusion do?
 

^ The description is at the bottom of the picture above. The effect is especially noticeable in Crysis. Here's a comparison I pulled up:

OFF:

[Screenshot: Crysis with Ambient Occlusion OFF]


ON:

[Screenshot: Crysis with Ambient Occlusion ON]
 
Nice. Any idea if 185.20 adds any other fixes or tweaks? Is it a DX9 or DX10 feature?

Any idea how much overhead that adds? It looks like it would be pretty resource intensive.
 
Wow, that's a pretty noticeable difference. I may try out these drivers myself.
 
Anyone tried this with a TripleHead2Go? Does this driver expose 5040x1050 for the TH2Go?
 
LOL! :D
Ambient Occlusion needs to be supported by the game, and Crysis has supported it for a long time on all GFX cards. At best this only brings scaling options, since Nvidia and ATI already support it.

The funny thing here is that people are really going to broadcast this feature as next gen, the same people who were talking down DX10.1. Ambient Occlusion is nothing but a fake of Global Illumination that reproduces at least some of its effect. I'm going to ROFL every time they post!

EA, SEGA, Blizzard and others have already signed up to support DX10.1 in newer games, and THQ already supports some of the features in their DX10.1 patch:

* Added DX10.1 support.
* Added MSAA for alpha-tested objects ("alpha-to-coverage" for DX10.0, custom "alpha-to-coverage" for DX10.1 for better performance, native DX10.1 implementation for better quality).
* Added a new quality level for sun shadows (ultra) for better picture quality.
* Added a new min/max shadowmap technique (which takes advantage of DX10.1 when available) that speeds up high-quality sunshafts rendering for top video boards and high resolutions.
http://www.btarena.org/games/s-t-a-l-k-e-r-clear-sky-update-1-5-07-reloaded

Read this on what DX10.1 offers when it comes to global illumination:
http://ati.amd.com/products/pdf/DirectX10.1WhitePaperv1.0FINAL.pdf

More is coming and even Windows 7 will have native DX10.1. ;)

So for those who now speak highly of Ambient Occlusion, which everyone gets anyway: DX10.1 and DX11 will support an even better illumination method. :D
 
Crysis has always had this capability; it never relied on drivers to deliver this option.

Even in DX9 mode, one could add the following command to a system.cfg file to enable Screen-Space Ambient Occlusion:
r_SSAO=1

Furthermore, one could control the quality and range of the AO via commands such as the following:
r_SSAO_quality = 2 (valid values 0-3)
r_SSAO_radius = 2

The only question that really matters here is whether these drivers provide any performance boost when enabling AO, because doing so would usually cause Crysis to take a roughly 7 FPS hit with other driver sets.
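
For anyone who wants to try it without waiting on a driver toggle, here is a minimal system.cfg sketch that just collects the cvars above (the values are the ones quoted in this post, so tune them to taste):

r_SSAO = 1
r_SSAO_quality = 2
r_SSAO_radius = 2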
 
If these fix stuttering/hitching b.s. in L4D, I might consider it. I'm wondering how many iterations it'll take for nVidia to realize how they completely fucked that up.
 

Yeah.
Any performance boost in GTA IV for that matter?

If so, I'll give these a shot. Also, would anyone care to test whether enabling AO with these impacts framerates in Crysis less than with other driver sets?
 
Ambient Occlusion needs to be supported by the game, and Crysis has supported it for a long time on all GFX cards. At best this only brings scaling options, since Nvidia and ATI already support it.

Wow, how do you segue from this into marketing for DX10.1? Are you Charlie in disguise?

Anyway, Crysis and now FC2 have native AO, but it seems Nvidia is trying to introduce the ability to force it on. Unless you have an alternative explanation for these drivers adding AO support for L4D and HL2:EP2, or for why my framerate in HL2:EP2 tanks with it enabled.
 
If these fix stuttering/hitching b.s. in L4D, I might consider it. I'm wondering how many iterations it'll take for nVidia to realize how they completely fucked that up.

Amen on that 1 brotha!
 
Wow, how do you segue from this into marketing for DX10.1? Are you Charlie in disguise?
Could be. Looks very much like AMD propagandizing to me, given the references to AMD white papers and all.
 
Wow, how do you segue from this into marketing for DX10.1? Are you Charlie in disguise?

Anyway, Crysis and now FC2 have native AO, but it seems Nvidia is trying to introduce the ability to force it on. Unless you have an alternative explanation for these drivers adding AO support for L4D and HL2:EP2, or for why my framerate in HL2:EP2 tanks with it enabled.


Most likely it's about scaling of AO, since it needs to be supported by the game in the first place.

Who's Charlie? Some kind of strawman you decided to use? If you cannot see the connection between AO and global illumination, there is no point in discussing this with you.

One of the features of DX10.1 I most want is wider use of the global illumination method, which is more efficient in DX10.1. This is something that has been marketed since the introduction of DX10.1. If you are marketing AO, you are marketing DX10.1 features as well. ;) Don't you see the connection between AO and DX10.1?

If you say "WOW, good feature" about Ambient Occlusion, then consider this:

Ambient occlusion is a "fake" global illumination technique; it's faster and it looks similar, however it is a very crude approximation to full global illumination.
http://wiki.answers.com/Q/What_is_difference_between_ambient_occlusion_and_global_illumination

DX10.1 provides faster (than DX10) "real" global illumination, which looks even better and more realistic.
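
(For reference, the common textbook form of the ambient occlusion term is just a visibility-weighted integral over the hemisphere at each surface point:

AO(p, n) = \frac{1}{\pi} \int_{\Omega} V(p, \omega) \, \max(0, n \cdot \omega) \, d\omega

where V(p, \omega) is 1 if the direction \omega is unoccluded within some radius and 0 otherwise. It only darkens points that are hemmed in by nearby geometry and never transports light between surfaces the way real global illumination does, which is why it counts as a crude approximation.)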

@phide:
Calling it AMD propagandizing is really very wrong, considering DX10.1 is an open standard all GFX manufacturers can use if they wish. It's not ATI exclusive. Not everything is about Nvidia vs. ATI.

Don't you like AO? Do you want it to look better? It's something DX10.1 offers: indexable cubemap arrays, which would give you scalable global illumination. Doesn't that sound sweet? Don't you want it in future games?
 
Calling it AMD propagandizing is really very wrong, considering DX10.1 is an open standard all GFX manufacturers can use if they wish. It's not ATI exclusive.
Correct, but at the moment it is an AMD exclusive. GT200 only offers certain DX10.1 extensions at this time, and NVIDIA can't claim DX10.1 compliance on any of their chips. It just seems a bit out of place for you to be heralding certain advantages of DX10.1 and citing AMD white papers, that's all. It looks like propagandizing.

Charlie is actually Charlie Demerjian, a writer for The Inquirer who exhibits a blatant pro-AMD bias.
 
If these fix stuttering/hitching b.s. in L4D, I might consider it. I'm wondering how many iterations it'll take for nVidia to realize how they completely fucked that up.

Disable multi-core rendering in the game options. Game runs perfectly smooth.
 
Correct, but at the moment it is an AMD exclusive. GT200 only offers certain DX10.1 extensions at this time, and NVIDIA can't claim DX10.1 compliance on any of their chips. It just seems a bit out of place for you to be heralding certain advantages of DX10.1 and citing AMD white papers, that's all. It looks like propagandizing.

I could cite Microsoft papers instead, but the AMD paper was the first I found to back up my statements. It doesn't matter, since it's all true even if it's an AMD paper. That AMD and S3 chose to support it doesn't make it an AMD-only product.

Given that many people just evangelize a company, I can understand your position, though I don't accept it. It's about the tech and the earlier downplaying of it. I WANT Nvidia to support it, and they will through DX11 anyway, whether they want to or not.

This is why I ROFL when people who downplayed DX10.1 (which also gave us faster AA) now hail AO. Those people are more about propaganda than the tech itself and what it can bring gamers.

AO is a cheap, fake form of global illumination, but better than no GI at all IMO. Global illumination is scalable and faster in DX10.1 (multiple indexable cube maps in DX10.1 vs. single non-indexable cube maps in DX10). Given what AO does and what DX10.1 offers in GI, you probably understand why I bring up the parallels.

I'm talking about the tech, not who does or doesn't support it. This is open tech which any company can choose to support, and no single GFX company controls it the way PhysX is controlled.

Linking me to an Inquirer writer is strawman argumentation. Please don't do it again, since it brings the discussion down to a low level. I'm not biased toward either of them and have had cards from both, often at the same time, since I have several computers. ATI and Nvidia are just companies that want money. I care more about gamers themselves, regardless of hardware, and have offered support to them for years. I try to back up my statements, and I mean what I say.

I do get some laughs, though, when people who downplayed DX10.1 are now hailing AO. :D It has nothing to do with Nvidia (though I get some laughs there too, considering they have requested support for DX10.1 features in games while still downplaying it). If they had implemented native support, games like Far Cry 2 wouldn't need an Nvidia-only rendering path. :D

PS: I would laugh if ATI did the same!

Edit: Being interested in tech, I wondered if someone could test the feature. Here is a way:
http://unigine.com/press-releases/080904-tropics/
This demo supports AO, and hopefully you can then see how the different AO settings in the Nvidia CP affect the demo. :)
 
Most likely it's about scaling of AO, since it needs to be supported by the game in the first place.

Could you explain exactly how the driver would control that considering that the game's AO implementation would be in a proprietary pixel shader?

If you cannot see the connection betweeen AO and global illumination, there is no point in discussing this with you.

No, I'm asking why you're coming into a thread about an Nvidia AO option just to bash it and carry on about DX10.1 and global illumination. What exactly are you trying to accomplish? You're saying we should ignore it because it's not as good as GI? Well ummmm, great, but where are all those real-time GI implementations?
 
FFS with the fanboi arguing. Answer the damn question: are there any performance increases or fixes in these or not? :mad: Get to the damn point, fuck the DX10.1 bullshit.

Newls1, what part of MULTI-CORE is hard to understand? More than 1 is multi. Dual, quad, octo, it's all multi-core.

Thank you

~Grumpy~
 
AO doesn't require any specific driver support. OpenGL has supported it for years now, as it's a very basic (albeit very intensive) algorithm. My company's game engine (OGL) supports it as well.

Note that AO takes 5-10 times as much processing time as rendering the scene's models and everything else if it is done at full resolution. If the game engine developer is smart, he renders the AO at reduced resolution, which speeds things up significantly. The visual difference will be negligible.

Of course, with a forced setting such a refinement cannot be used, and performance in the game will be halved at the very least. :)
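
To make the reduced-resolution point concrete, here is a minimal CPU-side sketch in C++ (my own illustration, not anyone's actual driver or engine code; the scene, bias value and sample pattern are made up): estimate occlusion per pixel from nearby depth samples, run the expensive pass on a half-resolution grid so it touches roughly a quarter of the pixels, then upsample the result.

#include <cstdio>
#include <vector>
#include <algorithm>

// Simple float image, used here for both the depth buffer and the AO term.
struct Buffer {
    int w, h;
    std::vector<float> data;
    Buffer(int w_, int h_) : w(w_), h(h_), data(w_ * h_, 0.0f) {}
    float& at(int x, int y) { return data[y * w + x]; }
    float get(int x, int y) const {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return data[y * w + x];
    }
};

// Crude occlusion estimate: the fraction of nearby depth samples that sit
// in front of the centre sample by more than a small bias.
float occlusion(const Buffer& depth, int x, int y, int radius) {
    float centre = depth.get(x, y);
    int occluded = 0, total = 0;
    for (int dy = -radius; dy <= radius; ++dy)
        for (int dx = -radius; dx <= radius; ++dx) {
            if (dx == 0 && dy == 0) continue;
            ++total;
            if (depth.get(x + dx, y + dy) < centre - 0.02f) ++occluded;
        }
    return float(occluded) / float(total);
}

int main() {
    const int W = 64, H = 64;

    // Synthetic depth buffer: a far "floor" (depth 1.0) with a near box (depth 0.4).
    Buffer depth(W, H);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            depth.at(x, y) = (x > 20 && x < 44 && y > 20 && y < 44) ? 0.4f : 1.0f;

    // AO pass at half resolution: only (W/2)*(H/2) occlusion evaluations
    // instead of W*H, i.e. roughly a quarter of the work.
    Buffer aoHalf(W / 2, H / 2);
    for (int y = 0; y < H / 2; ++y)
        for (int x = 0; x < W / 2; ++x)
            aoHalf.at(x, y) = 1.0f - occlusion(depth, x * 2, y * 2, 4);

    // Upsample back to full resolution (nearest neighbour keeps the sketch
    // short; a real engine would use a depth-aware upsample).
    Buffer aoFull(W, H);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            aoFull.at(x, y) = aoHalf.get(x / 2, y / 2);

    // Print the AO term along one row crossing the box: it dips on the floor
    // next to the box silhouette, which is the contact darkening AO adds.
    for (int x = 0; x < W; x += 4)
        std::printf("x=%2d ao=%.2f\n", x, aoFull.get(x, 32));
    return 0;
}

In a shader the same idea applies: the occlusion estimate runs per pixel of whatever render target you choose, so shrinking that target is the easy lever on cost.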
 
FFS with the fanboi arguing. Answer the damn question: are there any performance increases or fixes in these or not? :mad: Get to the damn point, fuck the DX10.1 bullshit.

Newls1, what part of MULTI-CORE is hard to understand? More than 1 is multi. Dual, quad, octo, it's all multi-core.

Thank you

~Grumpy~


Wow, so you feel better by making someone feel like an idiot? To me, multi-core means MORE THAN 2, kind of like the meaning of a "couple" or a "few".

Really, there's no need to be so rude about things; there are better ways to make a statement.
 
I'd like to see some user screenshots of Ambient Occlusion enabled vs disabled.
 
Could you explain exactly how the driver would control that considering that the game's AO implementation would be in a proprietary pixel shader?

I'm no game programmer, so I cannot explain exactly (and probably couldn't even if I were, considering I don't work for Nvidia making their drivers either). ;)

I can make a suggestion for discussion's sake, though. The computation happens in pixel shaders, but doesn't it work on a texture map tied to the geometry? If you reduced the surface area of the elements AO is applied to, wouldn't it then be possible to scale the AO? Kind of like the mipmap scaling that can also be done in drivers?

As I said, I don't work for Nvidia, so I can't explain exactly how they did it in their drivers. Their drivers aren't open source either, so unless you work for Nvidia, you don't know either.

As for your games working with AO:
http://www.moddb.com/mods/city-17-episode-1

You can even get HL1 to work with SSAO on any card. The Source engine has it built in.

No, I'm asking why you're coming into a thread about an Nvidia AO option just to bash it and carry on about DX10.1 and global illumination. What exactly are you trying to accomplish? You're saying we should ignore it because it's not as good as GI? Well ummmm, great, but where are all those real-time GI implementations?

The thread is about AO as well as Nvidia. AO is an illumination technique to simulate GI.

I hate to repeat myself, but I'm not bashing Nvidia's AO option. Perhaps to you this is ATI vs. Nvidia, but not to me. I'm not discussing ATI at all, just GI. I welcome any free option that lets people tweak things more to their liking.

And don't put words in my mouth. I haven't said to ignore AO. Global illumination isn't DX10.1-only either, but it's faster there. The original Stalker supported global illumination as well, through DX9.

What I do say is that it's funny that people downplayed DX10.1, where more effective global illumination was one of the main features, and now that there is a driver feature that probably just scales AO, it's touted as something big. With indexable cube map arrays you would already have more scalable GI. If Nvidia had provided support for DX10.1 instead of adding AO to the drivers later, we would all have had this much sooner, in all games. It's still coming, just delayed.

Don't you see the irony? :D

Nvidia downplays DX10.1, saying it doesn't add anything special.
Nvidia adds some DX10.1 features anyway (halfway instead of full) and needs special support for them in games. Go figure, when DX10.1 supposedly didn't add anything special...
Indexable multiple cube map arrays (global illumination) vs. a single cube map (global illumination) is one of DX10.1's major features.
Some Nvidia users downplay DX10.1, saying it doesn't add anything special.
Ambient Occlusion is an illumination method that simulates some of GI's effects for more realism.
Nvidia adds options for AO selection in the drivers.
Suddenly something special is added. WOW!

You don't find this funny? ROFL! :D

What am I trying to accomplish? Discussing AO means it's relevant to discuss GI (whose effects AO simulates anyway). Let people see the irony of downplaying DX10.1. Perhaps now people will see the benefits more clearly. It's worth having for gamers, and I want Nvidia to implement it in their new cards, as well as game developers providing a DX10.1 path. Not just for faster AA, but for DX10.1 GI.

With Nvidia on board, we would have seen more global illumination already.
I would have preferred Nvidia cards to have DX10.1, but they left it out by choice.

Btw, could you test the feature at its different levels using the Tropics demo I linked above? It has support for AO and can easily be switched on/off for testing purposes. :)
 
Could you explain exactly how the driver would control that considering that the game's AO implementation would be in a proprietary pixel shader?



No, I'm asking why you're coming into a thread about an Nvidia AO option just to bash it and carry on about DX10.1 and global illumination. What exactly are you trying to accomplish? You're saying we should ignore it because it's not as good as GI? Well ummmm, great, but where are all those real-time GI implementations?
Well, if you can answer the question of where all those great DX10 games are, or why we have to play crappy ports, you will have your answer, I think. :rolleyes:
 
I'm really pissed about Nvidia being so proud of their AO; why didn't they have it all along? Why the hell didn't they have DX10.1 support starting with the GTX 2x0 series?

It just goes to show that they don't give a rat's ass about good AA. And because they have so much influence over developers, they're preventing many games from supporting HW AA, because they don't support DX10.1.

They seriously need to cut this bullshit and quit preying on people's ignorance.

No, this is not a trolling post, but they don't do jack shit right and I can't get over it. I shouldn't have to deal with this unacceptable BS and arrogance from Nvidia just because there's no competent alternative, and because I played games well before half the people at Nvidia even knew what Doom was.
 
It seems it's only game tweaks. I can't say for sure, but I checked the game support list, and those games can be tweaked for the same effect:


Assassin's Creed
Bioshock
COD4
COD:WAW
CS:Source
Company of Heroes
Crysis (Warhead not supported)
Devil May Cry 4
Fallout 3
Far Cry 2
Half-Life 2: Episode Two (only!!)
Left 4 Dead
Lost Planet
Mirror's Edge
Call of Juarez
World in Conflict
World of Warcraft

http://laptopvideo2go.com/forum/index.php?showtopic=22520

It doesn't seem to need special hardware support either, from what I've read (the OP has been spammed all over the internet). :p

It's a nice feature anyway, though it's funny how it's been promoted, considering the downplaying of DX10.1's faster GI. :D
 
The computation happens in pixel shaders, but doesn't it work on a texture map tied to the geometry? If you reduced the surface area of the elements AO is applied to, wouldn't it then be possible to scale the AO? Kind of like the mipmap scaling that can also be done in drivers?

Texturing and dynamic mip-map generation are still fixed function features. AO is just a pixel shader. There's no way for the hardware or driver to know that it's an AO shader that's running.

Don't you see the irony? :D

I would, if this AO option were being touted as something big, as you claim. But thus far this is just a leaked driver. We have no idea how it's implemented, how Nvidia plans to market it, or how extensive the game support will be.

You make a fair point, though: Nvidia really can't pump up AO while openly dismissing DX10.1. One critical factor is missing, however, and that is evidence of real-time GI performance using indexable cube maps. ATi's ping-pong demo is rudimentary at best.
 
8800 GTS 512 SLI here, and I couldn't get these to run in SLI for Fallout 3. Single-GPU mode worked, but it was about 5-6 FPS lower. I couldn't tell the difference with AO on, assuming this game is actually supported. But it's DX10 only, right? I have also heard FO3 supports DX10 but have never seen that explained; I bet it would run better under DX9 even if DX10 were supported.
 
^ The description is at the bottom of the picture above. The effect is especially noticeable in Crysis. Here's a comparison I pulled up:

OFF:

[Screenshot: Crysis with Ambient Occlusion OFF]


ON:

[Screenshot: Crysis with Ambient Occlusion ON]

Hmmm... I think it looks better OFF. But TBH, aside from some of the leaves and that rooftop having a bit more sun shining on it, I really do not see any differences.
 
Having it on gives more realistic lighting and gives the image some depth, as far as the vegetation goes. That's how it looks to me, anyway.
 
Big Bang 4? Shadows look a little more realistic and prevalent with it on.
 