RivaTuner v2.05 released

quadnad
Supreme [H]ardness | Joined: Oct 24, 2005 | Messages: 7,656
What's new:
Updated databases for Detonator and ForceWare drivers. Added databases for ForceWare 163.69 and 163.71.
Improved driver-level overclocking module for NVIDIA display adapters:
Added a new user interface for the independent G8x GPU family shader clock control of ForceWare 163.67 and newer drivers. The new UI includes:
A new independent slider for adjusting the shader domain clock.
A new "Link clocks" option that lets you choose between traditional shader/ROP clock-ratio-based overclocking and completely asynchronous shader/ROP clock overclocking. You can either tick "Link clocks" and adjust only the ROP clock, letting RivaTuner overclock the shader domain using the VGA BIOS default shader/ROP clock ratio as pre-163.67 drivers did, or untick it and adjust both domain clocks fully independently (see the first sketch after this list).
A new overclocking profile format supporting independent shader clocks. Please note that old overclocking profiles are not supported, so you must recreate any previously existing overclocking profiles.
The previously available power-user-oriented ShaderClockRatio registry entry is now obsolete and no longer exists in RivaTuner's database. The ratio-based shader domain overclocking it provided is now fully covered by the new independent shader clock slider and the "Link clocks" option.
The new user interface is provided by default with ForceWare 163.67 and newer drivers under Windows Vista; Windows XP owners can also force the Vista-specific overclocking interfaces by setting NVAPIUsageBehaviour to 1. If needed, shader clock control can be forcibly disabled and the traditional overclocking UI restored by setting NVAPIShaderClockControl to 0 (see the second sketch after this list).
Power-user-oriented adjustable minimum and maximum clock slider limits have been expanded from 25%-300% to 10%-800%.
Added experimental SLI overclocking for Vista. Please note that I have no SLI rig for testing and development, so this feature has been added blind, and RivaTuner still does not provide official SLI support.
Minor UI changes and improvements.
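To make the linked-mode behaviour concrete, here's a minimal sketch of the arithmetic described above, assuming illustrative stock clocks of 500 MHz ROP and 1188 MHz shader for a G80 8800 GTS (the real values come from your card's VGA BIOS):

```python
# Minimal sketch of linked-mode shader clock derivation (illustrative only).
# Stock clocks are assumptions for a G80 8800 GTS; real values come from the VGA BIOS.

STOCK_ROP_MHZ = 500      # assumed stock ROP (core) clock
STOCK_SHADER_MHZ = 1188  # assumed stock shader clock

def linked_shader_clock(rop_mhz: float) -> float:
    """With "Link clocks" ticked, the shader domain follows the ROP clock
    using the BIOS default shader/ROP ratio (pre-163.67 behaviour)."""
    ratio = STOCK_SHADER_MHZ / STOCK_ROP_MHZ   # 2.376 for these assumed clocks
    return rop_mhz * ratio

# Overclocking the ROP domain to 650 MHz drags the shader domain along:
print(linked_shader_clock(650))  # ~1544 MHz; unticking "Link clocks" frees both sliders
```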
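And a second sketch of how the two power-user entries above interact. The entry names come straight from the release notes; the function, its parameters, and the default values are made up for illustration:

```python
def overclocking_ui(os_is_vista: bool, forceware_version: float,
                    nvapi_usage_behaviour: int = 0,
                    nvapi_shader_clock_control: int = 1) -> str:
    """Which overclocking UI RivaTuner shows, per the 2.05 release notes.
    The entry names are the real power-user settings; defaults are assumptions."""
    if forceware_version < 163.67:
        return "traditional UI (no shader slider)"
    if nvapi_shader_clock_control == 0:
        # Shader clock control forcibly disabled: old UI appearance.
        return "traditional UI (shader control disabled)"
    if os_is_vista or nvapi_usage_behaviour == 1:
        # Vista gets the new UI by default; XP can force it.
        return "new UI with independent shader slider"
    return "traditional UI (no shader slider)"
```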

This new release allows control over the shader clocks. Finally, G80 users will be able to overclock the shader domain. How long before someone like [H] does a proper performance comparison of various shader overclocks?

[Image: shaderclock.JPG]


Download: http://downloads.guru3d.com/download.php?det=163
 
People have already done tons of tests with BIOS flashes at different shader clocks and whatnot. It's well known.

I knew they had BIOS flashes to do it, but nothing as easy as messing about with a slider.

Thanks for the link, Kowan!
 
I knew they had BIOS flashes to do it, but nothing as easy as messing about with a slider.

Thanks for the link, Kowan!

Definitely... what I meant though is that the performance differentials are well-documented :).
 
Definitely... what I meant though is that the performance differentials are well-documented :).

I'll admit I haven't been keeping up with G80 all that well : /

However, I am on the edge of my seat about D8E (or whatever you'd like to call it at this point). I'm hoping (praying) that we'll see it this November :D
 
Is it just me, or was this more fun when one had to input specific ratios to increase the shader clock? :p

I can appreciate nifty sliders as much as anyone else though; much simpler.


edit: I'd wager there are more than a few people downloading RT 2.05 atm, my DL speed is crawling along at 9 KB/s :D
 
This is great! I clocked my GTS at 650/1523/1050 and Crysis plays smoother than ever! I now average a constant 30-70 FPS depending on where I am on the map. Before it was only about 25, and in battles it would be choppy.
 
This is great! I clocked my GTS at 650/1523/1050 and Crysis plays smoother than ever! I now average a constant 30-70 FPS depending on where I am on the map. Before it was only about 25, and in battles it would be choppy.


/jealous :D
 
This is great! I clocked my GTS at 650/1523/1050 and Crysis plays smoother than ever! I now average a constant 30-70 FPS depending on where I am on the map. Before it was only about 25, and in battles it would be choppy.


For the core and memory, did you flash the BIOS?

And what temp do you get at that overclock?
 
Does anybody know if you can now program fan rates in SLI?
I noticed the mention of SLI OCs, but didn't see anything about fans.

Thanks.

Oh yeah, thanks for the heads-up.
 
What should the average max (24/7-safe) shader clock be? I know it depends on the card, but would 1500 be a good number to just leave it at? I got a nice 6-10 FPS increase in Stranglehold and a 3-5 FPS increase in the Clive demo with the shader clock at 1500.
 
If you're using RivaTuner, setting an overclock in "performance 3D" has the card dropping back to stock speeds when not gaming.

Memory speeds, however, will remain at the overclocked values.
From the "2D/3D profiles won't change correctly?" thread, by Unwinder:
Read up on NVIDIA 2D/3D overclocking basics. The memory clock is supposed to be the same in 2D and 3D modes by design on most display adapters.
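A toy model of what that means in practice. All names and numbers here are illustrative (stock clocks assumed at 500/1188 for a GTS), not RivaTuner's internals:

```python
# Toy model of the 2D/3D behaviour Unwinder describes: core and shader clocks
# switch per mode, while the memory clock is shared between modes by design.

STOCK_CORE_MHZ, STOCK_SHADER_MHZ = 500, 1188  # assumed stock clocks

def effective_clocks(mode: str, oc_core: int, oc_shader: int, oc_mem: int) -> dict:
    """In "performance 3D" the core/shader OC applies; in 2D the card falls
    back to stock core/shader, but the memory OC sticks either way."""
    if mode == "3D":
        return {"core": oc_core, "shader": oc_shader, "memory": oc_mem}
    return {"core": STOCK_CORE_MHZ, "shader": STOCK_SHADER_MHZ, "memory": oc_mem}

print(effective_clocks("3D", 650, 1523, 1050))  # full overclock while gaming
print(effective_clocks("2D", 650, 1523, 1050))  # stock core/shader, memory still OC'd
```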
 
I know what you mean. I overclock, then revert to stock speeds after gaming.
If the memory dropped back to stock like the core/shader do, I'd leave it overclocked.
 
I know what you mean. I overclock, then revert to stock speeds after gaming.
If the memory dropped back to stock like the core/shader do, I'd leave it overclocked.

Just set your fan to run at 100% and you shouldn't have any problems.
 
For the core and memory, did you flash the BIOS?

And what temp do you get at that overclock?

I never flashed a BIOS. I just moved the sliders and hoped for the best. I do know I have an A3 revision card, though.
 
Just curious also, what's a good OC for the 8800 GTS?

I've got 650/1600/1000. I took the shader up to 1600 without issues but am afraid to push any higher... what is everyone else getting?

11,517 in 3DMark06
 
http://ozone3d.net/benchmarks/fur/

That's what I use to check for artifacts and stability.

For me, my core and memory seem to have very high limits as long as I leave the shader alone. The most I can get out of the shader on my card is 1512, with 660/1050 core/mem, and with that I now get 9100 in 3DMark06, for what that's worth.
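The bump-and-test routine everyone is describing is easy to capture in code. A rough sketch, where the clock-setting and artifact-check helpers are hypothetical stubs (RivaTuner has no scripting API that I know of; this just mirrors the manual slider-and-fur-benchmark loop):

```python
# Rough sketch of the manual "bump the slider, stress test, repeat" process.
# set_shader_clock() and stress_test_artifacts() are hypothetical stubs:
# in practice you move the RivaTuner slider and watch the fur benchmark yourself.

def set_shader_clock(mhz: int) -> None:
    print(f"(set shader clock to {mhz} MHz in RivaTuner by hand)")

def stress_test_artifacts(minutes: int = 15) -> bool:
    """Run a FurMark-style stress test and report whether artifacts appeared."""
    return False  # stub: replace with your own eyeballs

def find_max_stable_shader(start_mhz: int = 1188, step_mhz: int = 27,
                           ceiling_mhz: int = 1700) -> int:
    """Step the shader clock up until artifacts appear, then back off one step.
    The step size and ceiling are arbitrary illustrative choices."""
    clock = start_mhz
    while clock + step_mhz <= ceiling_mhz:
        set_shader_clock(clock + step_mhz)
        if stress_test_artifacts():
            break  # last stable value is `clock`
        clock += step_mhz
    set_shader_clock(clock)
    return clock
```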
 
I'm still using RivaTuner 2.04, but I have my eVGA 8800 GTS 320MB clocked at 675/1674/1008 and have been running that OC stable for a while now. Anything past 1674MHz on the shader artifacts in Vista right away, so that's about as high as I can OC it.
 
Mac[X-D];1031480033 said:
Just set your fan to run at 100% and you shouldn't have any problems.

I'm running an HR-03 Plus, so I'm not worried about the temps. I have no need to run it overclocked when not gaming, and it's simple to switch between a saved overclock profile and stock.
 
I just OC'd my 8800 Ultra to 675/2300 under the performance 3D tab. Is there any problem with just having it at that speed all the time when Windows starts up?
 
Just curious also, what's a good OC for the 8800 GTS?

I've got 650/1600/1000. I took the shader up to 1600 without issues but am afraid to push any higher... what is everyone else getting?

11,517 in 3DMark06

First off, since we're talking about the graphics cards, how about posting SM2.0 and SM3.0 scores so things are more comparable? Also, I assume you're talking about 1280x1024, but maybe not.

For me, I have two XFX 8800 GTS 640MB cards that I bought in March and September. I know my September card is rev A3, but I never checked the March card. It was converted to MCW60 water cooling and is currently stored, waiting for the move to the new case to be complete following the install of the dual WC loops, so I can't easily check it ATM.

The card I got in March topped out at 675/1566/999, but that was before I knew how to separate the shader's clock from the core clock. Also, that's on water cooling, as I didn't even try to OC it much 'till I converted it to water.

My September card seems pretty decent. In my P180B (using 6 120mm case fans) and the card's stock HSF, it works well at 684/1620/1080. The temps are 62C in WoW, even after a couple hours. The ATITool 3D window can get it to 70C after 15 minutes or so, but I never saw it break 70C, even after an hour or more.

For now, I run at 1280x1024 in Vista (32-bit), when not running dual monitors at 2560x1024 with certain games that support the whole dual-monitor thing.

My March card got these scores as part of an even 11,000 run at 675/1566/999:

SM 2.0 score: 5064
SM 3.0 score: 4801

My September card got these scores as part of an 11,088 run at 684/1620/1080:

SM 2.0 score: 5188
SM 3.0 score: 5105
 
For some reason I can't seem to keep my overclock when I restart the computer,
even when I have "apply OC settings to Windows startup" CHECKED.
Anybody else have this problem? Or maybe a solution?
 
I believe below that option there's a floppy disk icon where you create and save a profile for the new settings. After you create the profile, keep it selected, then choose to load it on startup, and it will load the profile you just saved with the floppy disk icon. Hope that helps.
 
I believe below that option there's a floppy disk icon where you create and save a profile for the new settings. After you create the profile, keep it selected, then choose to load it on startup, and it will load the profile you just saved with the floppy disk icon. Hope that helps.

Yeah, I tried that. For some reason it won't let me save when my memory clock is over 800.
This is seriously bugged for me, on Vista at least.
 
I have a weird problem: I have the slider, but I can't really overclock, since the clocks revert to default every time :( I have this problem with ForceWare 163.44 through 163.71, but 163.11 works fine, no matter what overclocking software I use.

The OS is Vista x64 and the GFX card is an 8800 GTX.
I have seen other people reporting the same kind of problem, even on XP :O
Is there any cure?
 
I have a weird problem: I have the slider, but I can't really overclock, since the clocks revert to default every time :( I have this problem with ForceWare 163.44 through 163.71, but 163.11 works fine, no matter what overclocking software I use.

The OS is Vista x64 and the GFX card is an 8800 GTX.
I have seen other people reporting the same kind of problem, even on XP :O
Is there any cure?

Are you using the newest RivaTuner?
 