Development Thread for ATI overclocking tool - v0.0.19 out

WOW

this thing is kickass

great job man

I love how it can find artifacts that are not even visible to the human eye, keep that for sure

lets you know that even if you can't see them, you are at the limit of your card

really great work
 
Does it work with 8500s? If not please add :D
 
OK, so I've finally tried it out and I must say that I am impressed. However, I think the approach it takes for finding max OC is a bit too aggressive.

When trying to find the max core/mem, the clock speeds are incremented way too quickly. More time should be given to each setting before moving on to the next. How much more? I can't say but 1 second is barely enough.

What I have found is that when trying to find the max core, the core speed climbs so quickly that by the time the app finds artifacts, it has to spend an inordinate amount of time slowly backing down to a safe clock. Why not take a moderate amount of time climbing up instead?

The memory, however, is a more dire story. The memory speeds ramp up so quickly that by the time artifacts are found, they come along with major desktop corruption requiring a reboot. For this reason the find-max-mem feature is not very useful to me right now.

Note that I spent some time heating up the GPU and memory w/ 3DMark before I used the utility because I figured it would be helpful to get them heated before I turned such a program loose.

All this being said, it is a very cool app. Unfortunately, what it thinks is a max core speed gives me white dots in the water in the nature demo. Which has gotten me wondering...

This util found 420 as the max whereas I get the white dots at anything above 410. I only get these dots in this demo so far as I can tell. I played KOTOR for close to three hours last night at 420 with nary a hiccup or artifact. I'm beginning to take the opinion that a GPU OC doesn't need to be as perfect and error free as a CPU OC. But that could be a whole other thread.
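The climb-then-back-off search described above could be sketched roughly like this. This is a simulation only, not ATITool's actual code: `is_stable` is a hypothetical stand-in for a real artifact scan, and in practice each call would include the longer per-step dwell time jmcmike is asking for.

```python
def find_max_clock(start_mhz, is_stable, climb_step=5, backoff_step=1):
    """Climb in coarse steps until the scan reports artifacts,
    then back off in fine steps until the card is stable again.

    is_stable(clock) stands in for a real artifact scan; giving each
    step a longer dwell time trades speed for confidence.
    """
    clock = start_mhz
    while is_stable(clock):
        clock += climb_step       # coarse climb toward the limit
    while not is_stable(clock):
        clock -= backoff_step     # fine back-off after first artifacts
    return clock
```

With a moderate climb step the back-off phase stays short, which addresses the complaint about spending "an inordinate amount of time slowly backing down".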
 
Alt-F5 flushes memory and clears artifacts without a reboot.
 
@stangman: dont think so :)

@jmcmike: thx for the input .. i'll increase the delay in the next release ..

i agree with you that gpu "stability" is not as important as cpu stability .. question is: are artifacts that you can't see considered artifacts? the problem here is where to draw the line ..
the comparison algorithm doesn't really know when you start noticing the artifacts .. different people might notice them at different times ..

i'll experiment with code that takes into account the color difference from what the pixel is to what it should be ..
 
I like the proggie w1zzard - some nice stuff there!!!

Any chance of getting it as a tab in advanced settings?

Maybe even 2 versions!!
 
Originally posted by Meaker
Alt-F5 flushes memory and clears artifacts without a reboot.

It also kills your videocard.


I just did this and instantly my screen went black. Now I cannot get it to turn back on without going into VGA mode. Even then, if I ever change the refresh rate from 60 to anything else, or if I just change the resolution, it black screens.

No overclock is set at startup, it just fugged it up! How can I fix this?

Going to try re-installing drivers.
 
YESSS!!! Finally got my comp up and running. :D

I've got the app goin' right now.....this might have been answered but.....how long should I let it look for artifacts on a certain clock before it can be deemed ok? Will it tell me if it passes or is it up to me to decide?
 
Originally posted by cornelious0_0
YESSS!!! Finally got my comp up and running. :D

I've got the app goin' right now.....this might have been answered but.....how long should I let it look for artifacts on a certain clock before it can be deemed ok? Will it tell me if it passes or is it up to me to decide?

good question, it doesn't tell you that the card has passed so it's up to you to decide. i tried overclocking my ram and after it'd reached a speed where there were no artifacts for five min i thought that would be stable. however when i started to overclock the core the ram started to generate errors and the gpu speed just dropped and dropped till the computer crashed.

lol, i had to read through that like five times and correct it loads before it made any sense, really should get some sleep
 
well, good news, bad news kinda thing...
I mentioned earlier that WarIII showed artifacts in what I thought to be a stable Ti4200 overclock... there was a reason for that...
damn stock heatsink just fell off.

:rolleyes:

I've been playing around with the AMD multipliers in Windows, so I've been inside the case in the past week; I know it's been just a few days since it fell off, if that.
 
I have found that in my system, if I have ATITool running with the 2D/3D profiles active, CPU usage spikes when opening apps and sometimes the open will outright fail, as though the app went into a blackhole. I hope W1zzard can get this fixed because it is one kick butt feature!
 
Originally posted by jmcmike
I have found that in my system, if I have ATITool running with the 2D/3D profiles active, CPU usage spikes when opening apps and sometimes the open will outright fail, as though the app went into a blackhole. I hope W1zzard can get this fixed because it is one kick butt feature!

you using the latest version? i thought i had fixed all of these problems there :(
can you reproduce it?
 
Originally posted by W1zzard
you using the latest version? i thought i had fixed all of these problems there :(
can you reproduce it?
Using v0.0.12

It is reproducible in the sense that it recurred over several reboots until I realized what was going on. I had task manager up when I went to launch something and I saw the CPU spike for a bit, then noticed it was ATITool. The app went into a black hole and I connected the dots.

FYI, this is WinXP Pro w/ all available SP/hotfixes running on Intel w/ an Intel chipset mobo. Processor and RAM, while overclocked, pass Prime95 and Memtest86 without error.

Feel free to PM me if you want more info, I'll be glad to help if I can.
 
uploaded version 0.0.13 to the server...
changes:
- lots of bugfixes
- added buttons for + and - on core and mem
- artifact scanning will now work on systems without ati card
- load on windows startup improved .. no more popping up windows
- increased delay after which clock speeds will be increased in find max. mode
- "unlimited" overclocking .. if you hit the maximum on a slider it will give you another 100 mhz
 
getting better and better, keep it up w1zzard

ok this is a picky little thing, when i open the 3d window the fps is really low for about a second, roughly 70fps, then it shoots up to 200+. this in itself isn't a problem but it does affect the average fps for quite a while, which is a bit annoying. if there was a slight delay before the average fps started (just over a second) then the low fps that occur when opening the window wouldn't have any effect.

like i said, bit picky. hope that made sense
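The suggestion above amounts to discarding the first second or so of samples before averaging. A minimal sketch of that idea (hypothetical, not the tool's actual FPS code; samples are assumed to be (timestamp_seconds, fps) pairs):

```python
def average_fps(samples, warmup_s=1.0):
    """Average (timestamp_s, fps) samples, discarding everything in the
    first warmup_s seconds so the low frame rate right after the render
    window opens doesn't drag the average down."""
    t0 = samples[0][0]
    kept = [fps for t, fps in samples if t - t0 >= warmup_s]
    return sum(kept) / len(kept)
```

For example, with samples of 70 and 75 fps in the first second followed by 200 and 210 fps, only the latter two contribute to the average.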
 
Originally posted by fr33ze
getting better and better, keep it up w1zzard

ok this is a picky little thing, when i open the 3d window the fps is really low for about a second, roughly 70fps, then it shoots up to 200+. this in itself isn't a problem but it does affect the average fps for quite a while, which is a bit annoying. if there was a slight delay before the average fps started (just over a second) then the low fps that occur when opening the window wouldn't have any effect.

like i said, bit picky. hope that made sense

please be as picky as possible :) fixed in next release
 
Originally posted by W1zzard
uploaded version 0.0.13 to the server...
changes:
- lots of bugfixes
- added buttons for + and - on core and mem
- artifact scanning will now work on systems without ati card
- load on windows startup improved .. no more popping up windows
- increased delay after which clock speeds will be increased in find max. mode
- "unlimited" overclocking .. if you hit the maximum on a slider it will give you another 100 mhz

awesome dude, getting better and better. I still don't have the patience to sit here and wait for the "find max" to find what it's looking for. ;) I will admit that it is working spectacularly though and there isn't anything out there that can touch this proggy in terms of functionality and ease of use.

I can only imagine what the final release will be like. :D
 
the renderer works fine on my Ti4200 now. As soon as I have a spare moment I'll check the artifact detector.

I'll let you know how it all works on my brother's Radeon 9000 as well.
 
Originally posted by W1zzard
uploaded version 0.0.13 to the server...
And what fantastic software it is, W1zzard, thank you for creating it! Preliminary results have the core up 48.17MHz (19.27%) & memory up 44.17MHz (22.1%) from stock on a 9100 128Mb. This makes the card a bit more bearable until recovery from Christmas brings a new card.

Notes:

- Error: Loading the XP Command Prompt, or Calc (from the Run prompt) caused...
Error in 3D-Detection:
Kernel32.dll has not been loaded after 5 seconds. Please submit a bugreport and include which application you just started.
Also had a GPF on it previously, but no info available. Probably me multi-tasking anyway.

- Suggestion: Comments have been made about the speed of the Max Core/Mem detection. Perhaps a good happy medium would be a simple checkbox for "Fast Test", where the increments are timed much closer together for those impatient. When unchecked, it tests on the same scale it does now, or perhaps even a little slower.

Thanks once again, respect to a fellow programmer.
 
welcome to the forums subwolf.

w1zzard have you got any advice on how long the artifact detection should run for before it is considered stable. You might wanna include a message somewhere cuz i can see some n00bs just waiting and waiting for something to happen when its detecting max speeds.
 
Originally posted by fr33ze
welcome to the forums subwolf.

w1zzard have you got any advice on how long the artifact detection should run for before it is considered stable. You might wanna include a message somewhere cuz i can see some n00bs just waiting and waiting for something to happen when its detecting max speeds.

heh, shoosh, I almost did. ;)

I've been letting it go for 5 minutes at any given clock speed before I consider it "stable" and artifact free.

I absolutely love this little app but I think that it would be kinda cool to have a "fast test" that you could run, even if the results weren't AS accurate. Some of us are VERY impatient damnit. ;) :p
 
Originally posted by cornelious0_0
heh, shoosh, I almost did. ;)

I've been letting it go for 5 minutes at any given clock speed before I consider it "stable" and artifact free.

I absolutely love this little app but I think that it would be kinda cool to have a "fast test" that you could run, even if the results weren't AS accurate. Some of us are VERY impatient damnit. ;) :p

I disagree. That would make the tool less reliable. If you want a fast test, run Aquamark and 3dmark nature test and check if you see artifacts. If you want a thorough test, let Wizzard's tool find the speed at which the card runs stable for five minutes.
 
For a quick tool to test for artifacts (which can also be made to be not so quick so it's more accurate) try the ArtifactTester program. The current version is 5 I believe, but I haven't been able to find its homepage yet. You can change the intensity of the test, so you can start from a quick little test that runs in about 2 seconds or go as high as something that's more like 5 minutes. The program automatically detects artifacts and will give you a "score" (which is a silly term since normally score implies that the higher the better, but this is like golf, you want lower -- except in golf you can't get a 0.) d-:

Anyway, nice idea with the program. I was surprised the drivers didn't have that built in or anything. I misunderstood something someone said before and I thought that the 9600 Pro was supposed to be overclockable right out of the box. Anyway, I'm using the RadLinker program right now and am pretty much satisfied with it for the moment. Basically, just duplicate everything they have (especially I like how it shows accurate numbers and you can control what is used when changing the values) and I will be happy. I also like how it can set the clock values without having to set a program to start up every time (actually, I think it changes something so the setting survives rebooting, but I'm not sure really.) In the meantime, I'll just bookmark your site and keep an eye on this thread until your software is a little less beta.

BTW, why does it say that the 9600 is locked against overclocking and one shouldn't use the original catalyst drivers? I haven't used anything else yet I never had trouble overclocking. Does this just mean for the non-pro and maybe that was what I misunderstood before? But, if so, it shouldn't tell me this when I first run it.
 
Originally posted by W1zzard
i agree with you that gpu "stability" is not as important as cpu stability .. question is: are artifacts that you can't see considered artifacts? the problem here is where to draw the line ..
the comparison algorithm doesn't really know when you start noticing the artifacts .. different people might notice them at different times ..

i'll experiment with code that takes into account the color difference from what the pixel is to what it should be ..

hey wizzard !!!

you have the chance to write a great and helpful tool !

Your questions: Don't reduce the whole overclocking question to artifacts !!! It's more complex than artifacts alone !
The problem with OC already comes up when the shader code in the GPU calculates wrong results because of a too-high overclock - the same as in prime95 for CPUs !

I just d/l'd your prog (i got a 9800NP->Pro btw)...but i can already say it might not be the best solution if you only 'look' for artifacts and compare them to their color values etc...i might be wrong...but...

I personally would go the same route as prime95....eg. do heavy floating point calculations in a shader.....and compare the results to the values in a table !!! If the results are wrong then the OC is too high - no matter if there are artifacts or not.

That's the reason why programs like rthdribl and other progs that use heavy math/FP calculations are much more sensitive than programs that do easy integer math and simpler stuff....

Don't reduce your proggy to the visible artifacts..what use is having no artifacts if the code in the shader goes crazy because of the OC ?


greetings
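The Prime95-style idea in the post above is: run a known, math-heavy computation and compare the results against a precomputed table, so any numeric drift flags an unstable overclock even when nothing on screen looks wrong. A CPU-only sketch of that verification scheme (hypothetical; on a real GPU the `results` list would come back from a pixel shader):

```python
import math

def reference_table(n):
    """Precomputed 'known good' results, analogous to Prime95's
    self-test values (the formula here is an arbitrary FP-heavy stand-in)."""
    return [math.sin(i) * math.cos(i * 0.5) for i in range(n)]

def verify(results, table, tolerance=1e-6):
    """Flag an unstable overclock if any result drifts from the table,
    even when no pixel would look visibly wrong."""
    return all(abs(r - t) <= tolerance for r, t in zip(results, table))
```

The tolerance would need tuning: GPU floating point of that era was not required to be bit-exact, so a zero tolerance would produce false failures even at stock clocks.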
 
Originally posted by Nazo
BTW, why does it say that the 9600 is locked against overclocking and one shouldn't use the original catalyst drivers? I haven't used anything else yet I never had trouble overclocking. Does this just mean for the non-pro and maybe that was what I misunderstood before? But, if so, it shouldn't tell me this when I first run it.

you running on a 9600 pro? could you find out what your device id is? it's somewhere under display settings - advanced - options
 
Just tried the Overclocker on my overclocked 9700pro- looks neat.
Did encounter some probs though. I have my memory at 351MHz - the OC tool raises from that - starts getting errors around 359MHz which is about right, it then starts decreasing the OC but does not manage to clear the errors even when it's back at 351MHz.
On the core I let it go from 381MHz up to 390MHz and it still was not showing artifacts, but I do get artifacts at that speed in trolls lair (3DMark03).
So looks like it could be useful if the above probs could be ironed out.
 
I think the errors don't clear due to the overheating causing some bugs until it cools off, which takes a few moments. Perhaps this would be solved by lowering the clock to the BIOS default, pausing for a few moments, then jumping back to the "last known good" clock. Just an idea anyway. It would make the whole process take longer though I suppose.

BTW, I don't know much about the normal clocks for the 9700, but my 9600 had a core at 398 by default and I've found I can stably get it all the way up to 521 or so, so you might want to check that pretty thoroughly just to be sure. Mind you, the 9700 has a pretty different chip, so it may be intended to be that low.

Oh, and I get some strange artifacts (a bunch of dots, I guess shaders gone mad) in 3dMark03 that appear in nothing else. Primarily in the flight thing rather than the trolls lair, but I still wonder why I don't notice anything like that in anything else.
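The cool-off idea in the post above (drop to the BIOS default, pause, then jump back to the last known good clock and rescan) could be sketched like this. Everything here is a hypothetical stand-in for driver and scanner hooks, not ATITool's real interface:

```python
def retest_after_cooldown(last_good, default_clock, set_clock, wait, scan_ok):
    """Drop to the default clock so heat and corrupted state can clear,
    pause, then re-check the last-known-good clock.

    set_clock, wait and scan_ok are hypothetical callables standing in
    for the driver call, a cool-off delay, and an artifact scan.
    """
    set_clock(default_clock)   # back to safe default while the card cools
    wait()                     # cool-off pause
    set_clock(last_good)       # jump straight to the candidate clock
    return scan_ok()           # True only if it is stable when cool
```

As the poster notes, this lengthens the whole find-max run, but it avoids declaring a clock unstable just because the card was still hot from the previous failing step.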
 
If people are looking for a nice easy way to make sure the results are as accurate as possible try this out. I'll leave rthdribl running in fullscreen overnight for about 7 hours, wake up, make sure the clocks are set to default, and let W1z's program go. I think a lot of people might be forgetting to let the card warm up first.....very important.

This thing is finally to the point where it is "5 minutes stable" at about the exact same clocks where I had manually found my limit. :D Problem is that I just got an artifact yesterday in NFS Underground. I guess I'll let ATI Tool test a clock for 10 minutes before I decide the new speed is fine.
 
Originally posted by flexy123
Don't reduce your proggy to the visible artifacts..what use has no artifacts but the code in the shader goes crazy because the OC ?
That is what a lot of us have been saying the entire time. Artifacts you can't see are still errors. They will show up in some application, somewhere, sometime.
 
version 0.0.14

- by popular request: new renderer with some ultra-artifact-picky pixel shader 2.0 code .. hope you're not disappointed by your max core OCs now :)
- rewritten more than half of the 3d app detection code .. it's no longer crashing other apps .. at least i couldn't find any ..

can someone check what happens on a non-ps 2.0 card plz?
 
Originally posted by W1zzard
version 0.0.14

- by popular request: new renderer with some ultra-artifact-picky pixel shader 2.0 code .. hope you're not disappointed by your max core OCs now :)
- rewritten more than half of the 3d app detection code .. it's no longer crashing other apps .. at least i couldn't find any ..

can someone check what happens on a non-ps 2.0 card plz?

OH MY GOD.....I love you!!! :D

/bow

I really couldn't care less if my clocks are lower using this version (haven't actually used it yet).....the pickier the better IMO, that way you know there's even less of a chance of ever encountering anything in a game.

Thx again for perfecting this app even more and making me even lazier. ;) Just what I needed. :rolleyes:
 
my cpu usage stays at 100% while the render window is open. cpu usage goes back to normal once atitool isn't on top and another window is. don't know if this is normal......and when i have atitool open and i click the close 3d view tab my cpu usage goes back to normal also.......it seems the render window is causing 100% cpu usage.......if this is known, disregard my post......if not, hope this helps......my system specs are in sig
 
thanks a ton wizz...great proggy, nailed my clocks exactly where I found them (well, 2mhz higher on the core, but it games there with no problem!). Great work as always.
 
Originally posted by Smititty
my cpu usage stays at 100% while the render window is open. cpu usage goes back to normal once atitool isn't on top and another window is. don't know if this is normal......and when i have atitool open and i click the close 3d view tab my cpu usage goes back to normal also.......it seems the render window is causing 100% cpu usage.......if this is known, disregard my post......if not, hope this helps......my system specs are in sig

that's by design .. the 3d view renders without a frame limit to stress the gpu as much as possible .. this results in some (a lot of) cpu load .. i'll see if this can be optimized but don't bet on it
 
The 0.14 version gives the error message "No compatible devices found. At least Vertex Shader 1.1 is required." I've run the "sc delete atitool" command before opening this latest version. What's wrong?
 
Much better with the new version 0.14.
The memory overclock now clears the errors once it's passed the peak and settles on a value which is the same as I find using 3DMark03, at 357MHz. The core clocker is also damn near perfect - it gave me a stable value at 393MHz whereas the highest value I can get to without any "snow" in trolls lair is 387MHz.
All on my HS-modded, VGPU-modded (only at 1.61V at the moment) 9700Pro.
Congrats- very useful clever prog.

One query - I remember someone stating that the increments available for overclocking were something like 5MHz at a time - never did understand if this was a rounding/increment error of the overclocker software or an intrinsic part of the clock controls on the vid card itself.
 