ATI Powerplay Bug [Discussion Merge]

America's Army, Crysis, and UT3 are not as good with 8.3. Crysis looks great but stuttered and was worse than on 8.2. At least with 8.2 I could play fine, even with the 1.1 Crysis patch; somehow 8.3 affected both my CPU performance and the video card.

If I use RivaTuner to monitor any game, it shows the dreaded switching: unless I use max settings to stress the card or force 3D clocks with the AMD GPU tool, it constantly switches between 2D and 3D clocks. They should just let the card stay at high 3D clocks with a high fan speed whenever a game starts, regardless. It's a performance card, and I don't care if the fan runs high as long as it doesn't overheat or drop into 2D mode while gaming! If I force the default 3D clocks, it runs way better, with the fan at 100%.

I had to go back to the 8.2 Catalyst driver, which gave me a more stable system and better performance on XP x64. My CPU no longer takes a performance hit, and I'm not just talking about 3DMark06: 8.3 hurt my whole system's performance in every game, even with a clean install (and no sound card).

Note: 8.3 was even buggy to install and remove; I had to restart the PC several times. Once I had 8.2 installed, most of my problems went away. The card still switched modes a lot, but it was playable. 8.3 stutters and freezes and seemed to make the whole system respond more slowly.

I'll be installing Vista 32-bit or 64-bit tomorrow to see if that helps, and to see how DX10 runs in Crysis.


9600BE
TA770
Dominator 8500x2gig
WD160gig Sata 3g
Viewsonic 2025wm
XFigamer
XPproX64bit
 
E8400
HIS 3870X2
BX38BT 4GB DDR3 all stock
Vista x64 SP1

I've noticed the exact opposite with the 8.3s. The PowerPlay problems in Crysis 1.2 are nowhere near what's happening to you above; when it does happen, only my GPU frequencies drop momentarily, not the memory as well. May I ask what brand you are using? In any case, I have submitted tickets to both ATI and HIS; no reply from HIS, but ATI is still giving me the automated canned responses. I also fired off feedback to their driver team. From my point of view, it seems they are aware of the problem but have no idea how to fix it without forcing vendors to release a new BIOS for every 3-series card. That's just how it looks to me. I'm still happy with performance overall. Crysis seems to be the worst offender for the PowerPlay issue among the games I've been playing (GRAW2, BioShock, Crysis, WoW, C&C3). But like I said, I'm liking the new drivers and the 1.2 Crysis patch; performance has increased substantially in 64-bit mode for me, and 1920x1200 with everything on Medium and High shaders is very playable for me at the moment.

Here are a couple of RivaTuner shots from HL2 EP2 and Crysis DX10, 64-bit, patch 1.2.

Crysis



HL2 EP2
 
I'm using a Visiontek 3870 and a 3870 X2. Crysis is exhibiting this behavior on both.

I'm curious, though, about what the problem REALLY is. Is it a driver issue that bottlenecks the cards, causing performance to drop, after which the cards downclock themselves?

Or is it that the cards' downclocking themselves is what causes the drop in performance?
 
While searching for this issue, I found a thread where this guy says it better than I can.
I'm not quite convinced this is an accurate characterization of what's happening. Think for a minute about some details of the explanation of PowerPlay's functionality. The embedded controller makes its decisions based on an evaluation of the contents of the command buffer. That suggests to me that if the GPU is clocking down, it's because the command buffer is either empty or nearly empty. I can easily envision a case where the GPU has completed a frame and vsync is enabled, causing the queue of pending operations to zero out long enough for the card to decide to downclock. Also, cases where data must be loaded or retrieved from the pagefile could easily trigger a downclock.

What I'm saying here is that you're making the assumption that PowerPlay is triggering hitches. But what if you have your causality mixed up, and it's the hitches that are triggering PowerPlay?

That doesn't mean the issue isn't ATI's responsibility. There could be a driver bug leading to performance issues (like the framerate problem with cinematic playback in Crysis with pre-7.12 Catalyst), but that doesn't mean Powerplay is the cause.
http://forums.techpowerup.com/showthread.php?p=616941#post616941
 
That's entirely possible as well. I'm still waiting to hear something official from ATI about the whole thing. Even in Crysis, my GPU load is not 100% on the core I am monitoring. I don't think that's right.
+1.
 
I've seen this happening in Guild Wars since day one. It causes intense stuttering when I switch to another application (like Ventrilo or IE), lasting 3-5 seconds. The GPU drops from 775 MHz (I keep it at stock) down to 297 MHz.

In Stranglehold (UT3 engine) it is less noticeable for me, but I don't have any AA turned on in the CCC control panel, and I haven't changed anything in RivaTuner to stress it further.
 
It causes intense stuttering when I switch to another application (like Ventrilo or IE), lasting 3-5 seconds. The GPU drops from 775 MHz (I keep it at stock) down to 297 MHz.

I thought this was a problem? I would put it to you that while you're playing a game, you shouldn't dick around with other programs that make you leave the game.

If that is the problem everyone is reporting, then it's been happening for years....
 
No, mine happens in all games in full-screen mode. It should be at the high 3D clocks, but it switches to the low-power 3D clocks.
 
Another couple things I have noticed:

-In RivaTuner, I can choose from two items in the "Target Adapter" list. The first says it's the 3870X2 [Generic PnP Monitor], which is the bottom card in my system. This one displays an image just fine, so I have my LCD hooked up to it. The second item in the list is HD3870X2 [Default Monitor], the top card in my system. But when I hook my DVI cable up to that card, I get no signal.

-The top card's fan speed can be changed; I have it running at 40%. The bottom card's can't. I change it in RivaTuner, but nothing changes on the card itself.

-In the CCC Overdrive panel, I have four entries. The first is HD3870X2 [AL2423W] (my monitor's model number); the other three just say HD3870X2. The first entry always has the highest idle temp, at about 60 degrees. The other three idle at about 41-44 degrees, which corresponds to the "Default Monitor" in Riva.

-When monitoring games, I'm looking at the "Default Monitor" entry in Riva, and it shows the low GPU usage and the downclocking.

Again, just trying to get as much info out as possible on this, and receive as much feedback as possible too.
 
I noticed this with 8.3 on Windows XP Pro.

But ever since I upgraded to Vista 64, I have not noticed it.
I'll check again.
 
I thought this was a problem? I would put it to you that while you're playing a game, you shouldn't dick around with other programs that make you leave the game.

If that is the problem everyone is reporting, then it's been happening for years....

I never had this happen on any of my other cards or systems. My previous card was a 7950GT, which ran flawlessly, but I retired it for the current games.

On the other machines:
I have an X1600 PRO (on low settings, though), and Guild Wars with other programs running alongside doesn't bog it down. I even tried it on an internet-surfing PC (with an E4500 and an 8400GS), and it doesn't happen there either.
 
Does having ATITool open in the background alleviate this problem? IIRC, ATITool can force 3D clocks while it's open.
 
As far as I know, ATITool doesn't play well with X2s in CrossFireX. I may be wrong, but the last time I tried it I couldn't get it working at all.
 
Pestilence - you seem pretty knowledgeable here. Have you taken a look at some of the test fixes? I'm definitely not savvy enough to try these; I'm still working on getting CCC 8.3 working :]
http://www.sapphiretech.com/en/forums/showthread.php?t=18798

Looks interesting. I'll definitely try it when I get home from work today. The only other problem is that the fan profile on my cards is a bit funky too, I think. I can only control one card using RivaTuner; the other card gets pretty hot during games.

On another note, I talked to Kyle, and he said he has gotten some ATI eyes on this subject and they are looking into it. There apparently is a real issue here and it's being worked on; just no details at the moment.
 
This problem is noticeable in F.E.A.R., and even more persistent at lower resolutions.
I have a 3850.
 
Interesting but promising results, guys.

I tried the method outlined here - http://www.sapphiretech.com/en/forums/showthread.php?t=18798 - and changed all my PowerPlay settings to 855/955 using the XML file generated from the Overdrive profile I created. Here is the result:

Crysis: Stuttering is nearly completely gone! I was able to run 1920x1200 with everything on High, getting about 30-45 fps; the only stuttering occurred at checkpoints. There was some slowdown to about 25 fps when streaming in new textures every once in a while, but it was completely playable.


[EDIT] Crysis @ 1920x1200 on Very High: Loaded up the first level. Got through the opening cutscene up to the point where you're about to jump out of the plane, and had a driver crash. Went and looked at RivaTuner: my temp had spiked to 100 degrees and the card shut down. Looks like the fan didn't kick in at all; to clarify, it stayed at a 22% fan duty cycle. This always happens with the bottom card; the fan profile is not working correctly on it. The top card in the system, however, can stay at a 40% fan duty cycle forced through RivaTuner, and it holds just fine.

[EDIT2] Crysis @ 1920x1200 on Very High: Loaded up the first level WITHOUT playing the cinematic. The game played just fine, about 25-30 fps the entire time. Temp never went above 70 degrees this time; GPU activity was between 70-90%.


HL2 EP2: Was able to play the "This Vortal Coil" level just fine: a constant 60 fps at 1920x1200 with 8xAA/16xAF. But when I loaded the scene in "Return to White Forest" where you talk to Dr. Kleiner on the video monitor, it was pegged at 30 fps. Before this modification to the driver, it was pegged at 18 fps. RivaTuner showed about 30% GPU activity on the core I was monitoring. It would seem this is a small bug in either the game or the driver.


CoD4: Loaded up The Bog @ 1920x1200 with 4xAA/16xAF, got about 40 seconds in at a constant 60 fps, and got a driver crash. But within about 10 seconds it recovered and the game was running again. Finished the level without dipping below 60 fps; GPU activity on the core I was monitoring was about 35-60%.

[EDIT] CoD4: Tried to reproduce the crash above. This time I got much further in before it crashed, but it again did an in-game recovery and was fine after that.


BioShock: 1920x1200, everything maxed out. A solid 60 fps; it didn't dip once.
 
Well, after all this, it would seem that if ATI is aware of the issue and working on a fix, we're in good shape. If I can edit a simple XML file and get these kinds of results compared to what was happening before, I'm sure ATI can truly fix the issue in a hotfix or the next driver revision, letting PowerPlay stay on but function correctly in 3D apps.
 
Sounds like you have an overheating issue, Pestilence (you mentioned the fan problem). I'm guessing that's what's causing your instability.
I noticed there are some fan control options in the XML profile file. I wonder if we could manually tweak the fan speeds from there too.
 
Hm. When I click "Activate and Close," the CCC tab still shows the low 2D clock speeds (300 MHz GPU) in the current clock settings box.
 
Sounds like you have an overheating issue, Pestilence (you mentioned the fan problem). I'm guessing that's what's causing your instability.
I noticed there are some fan control options in the XML profile file. I wonder if we could manually tweak the fan speeds from there too.

Well, it only happened once, but the bottom card definitely has something wrong with its fan profile. In the XML it was changed from Automatic to Manual and set to 40%, just like the top card; it just doesn't take effect. The top card was edited to 40% and actually changed.

Also, Jebo, make sure you change EVERY instance of "CoreClockTarget" and "MemoryClockTarget" in the XML. Here is what my XML looks like:

<Profile>
<Caste name="Graphics">
<Groups>
<Group name="Overdrive5">
<Feature name="CoreClockTarget_0">
<Property name="Want_0" value="85500" />
<Property name="Want_1" value="85500" />
<Property name="Want_2" value="85500" />
</Feature>
<Feature name="MemoryClockTarget_0">
<Property name="Want_0" value="95500" />
<Property name="Want_1" value="95500" />
<Property name="Want_2" value="95500" />
</Feature>
<Feature name="CoreVoltageTarget_0">
<Property name="Want_0" value="1244" />
<Property name="Want_1" value="1244" />
<Property name="Want_2" value="1304" />
</Feature>
<Feature name="MemoryVoltageTarget_0">
<Property name="Want_0" value="0" />
<Property name="Want_1" value="0" />
<Property name="Want_2" value="0" />
</Feature>
<Feature name="FanSpeedProtocol_0">
<Property name="FanSpeedProtocolProperty" value="Percent" />
</Feature>
<Feature name="FanSpeedAlgorithm_0">
<Property name="FanSpeedAlgorithm" value="Manual" />
</Feature>
<Feature name="FanSpeedRPMTarget_0">
<Property name="Want" value="0" />
</Feature>
<Feature name="FanSpeedPercentTarget_0">
<Property name="Want" value="40" />
</Feature>
<Feature name="CoreClockTarget_1">
<Property name="Want_0" value="85500" />
<Property name="Want_1" value="85500" />
<Property name="Want_2" value="85500" />
</Feature>
<Feature name="MemoryClockTarget_1">
<Property name="Want_0" value="95500" />
<Property name="Want_1" value="95500" />
<Property name="Want_2" value="95500" />
</Feature>
<Feature name="CoreVoltageTarget_1">
<Property name="Want_0" value="1244" />
<Property name="Want_1" value="1244" />
<Property name="Want_2" value="1304" />
</Feature>
<Feature name="MemoryVoltageTarget_1">
<Property name="Want_0" value="0" />
<Property name="Want_1" value="0" />
<Property name="Want_2" value="0" />
</Feature>
<Feature name="FanSpeedProtocol_1">
<Property name="FanSpeedProtocolProperty" value="Percent" />
</Feature>
<Feature name="FanSpeedAlgorithm_1">
<Property name="FanSpeedAlgorithm" value="Manual" />
</Feature>
<Feature name="FanSpeedRPMTarget_1">
<Property name="Want" value="0" />
</Feature>
<Feature name="FanSpeedPercentTarget_1">
<Property name="Want" value="40" />
</Feature>
<Feature name="CoreClockTarget_2">
<Property name="Want_0" value="85500" />
<Property name="Want_1" value="85500" />
<Property name="Want_2" value="85500" />
</Feature>
<Feature name="MemoryClockTarget_2">
<Property name="Want_0" value="95500" />
<Property name="Want_1" value="95500" />
<Property name="Want_2" value="95500" />
</Feature>
<Feature name="CoreVoltageTarget_2">
<Property name="Want_0" value="1244" />
<Property name="Want_1" value="1244" />
<Property name="Want_2" value="1304" />
</Feature>
<Feature name="MemoryVoltageTarget_2">
<Property name="Want_0" value="0" />
<Property name="Want_1" value="0" />
<Property name="Want_2" value="0" />
</Feature>
<Feature name="FanSpeedProtocol_2">
<Property name="FanSpeedProtocolProperty" value="Percent" />
</Feature>
<Feature name="FanSpeedAlgorithm_2">
<Property name="FanSpeedAlgorithm" value="Manual" />
</Feature>
<Feature name="FanSpeedRPMTarget_2">
<Property name="Want" value="0" />
</Feature>
<Feature name="FanSpeedPercentTarget_2">
<Property name="Want" value="40" />
</Feature>
<Feature name="CoreClockTarget_3">
<Property name="Want_0" value="85500" />
<Property name="Want_1" value="85500" />
<Property name="Want_2" value="85500" />
</Feature>
<Feature name="MemoryClockTarget_3">
<Property name="Want_0" value="95500" />
<Property name="Want_1" value="95500" />
<Property name="Want_2" value="95500" />
</Feature>
<Feature name="CoreVoltageTarget_3">
<Property name="Want_0" value="1244" />
<Property name="Want_1" value="1244" />
<Property name="Want_2" value="1304" />
</Feature>
<Feature name="MemoryVoltageTarget_3">
<Property name="Want_0" value="0" />
<Property name="Want_1" value="0" />
<Property name="Want_2" value="0" />
</Feature>
<Feature name="FanSpeedProtocol_3">
<Property name="FanSpeedProtocolProperty" value="Percent" />
</Feature>
<Feature name="FanSpeedAlgorithm_3">
<Property name="FanSpeedAlgorithm" value="Manual" />
</Feature>
<Feature name="FanSpeedRPMTarget_3">
<Property name="Want" value="0" />
</Feature>
<Feature name="FanSpeedPercentTarget_3">
<Property name="Want" value="40" />
</Feature>
</Group>
</Groups>
<Adapter name="PCI_VEN_1002&amp;DEV_950F&amp;SUBSYS_02441043&amp;REV_00_6&amp;3B4CB29&amp;0&amp;00600030A">
<Aspect name="Overdrive5" />
</Adapter>
</Caste>
</Profile>



This is because I have four GPU cores in my system. Unfortunately, regarding fan speed control, I can't tell which entries control the fans anyway; I think only two of them do. Granted, even with all four set to Manual and 40%, only one card actually changes. So I think the BIOS on my bottom card may be messed up.
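For anyone making these edits by hand across all four entries, here's a rough sketch of automating it with Python's standard ElementTree. The MHz-to-stored-value scaling (x100, so 855 MHz becomes 85500) is inferred from the 85500/95500 values in the profile above; the cut-down sample profile here is a stand-in, not the real file, so treat this as a starting point rather than a drop-in tool.

```python
import io
import xml.etree.ElementTree as ET

# Cut-down stand-in for a CCC Overdrive profile (the real file has
# CoreClockTarget_0..3, MemoryClockTarget_0..3, and so on).
SAMPLE_PROFILE = """<Profile><Caste name="Graphics"><Groups><Group name="Overdrive5">
<Feature name="CoreClockTarget_0">
  <Property name="Want_0" value="30000"/>
  <Property name="Want_1" value="30000"/>
  <Property name="Want_2" value="77500"/>
</Feature>
<Feature name="MemoryClockTarget_0">
  <Property name="Want_0" value="50000"/>
  <Property name="Want_1" value="50000"/>
  <Property name="Want_2" value="90000"/>
</Feature>
<Feature name="FanSpeedAlgorithm_0">
  <Property name="FanSpeedAlgorithm" value="Automatic"/>
</Feature>
<Feature name="FanSpeedPercentTarget_0">
  <Property name="Want" value="22"/>
</Feature>
</Group></Groups></Caste></Profile>"""

def force_clocks(tree, core_mhz, mem_mhz, fan_pct):
    """Rewrite every CoreClockTarget_*/MemoryClockTarget_* Want value and pin
    each fan to a fixed manual duty cycle. Clocks appear to be stored as
    MHz x 100 (855 MHz -> 85500), matching the profile posted above."""
    for feature in tree.iter("Feature"):
        name = feature.get("name", "")
        if name.startswith("CoreClockTarget"):
            for prop in feature.iter("Property"):
                prop.set("value", str(core_mhz * 100))
        elif name.startswith("MemoryClockTarget"):
            for prop in feature.iter("Property"):
                prop.set("value", str(mem_mhz * 100))
        elif name.startswith("FanSpeedAlgorithm"):
            feature.find("Property").set("value", "Manual")
        elif name.startswith("FanSpeedPercentTarget"):
            feature.find("Property").set("value", str(fan_pct))
    return tree

tree = force_clocks(ET.parse(io.StringIO(SAMPLE_PROFILE)), 855, 955, 40)
```

Against the real profile you'd use `ET.parse(path)` and `tree.write(path)` instead; every `_0` through `_3` entry gets the same treatment, which is exactly the "change EVERY instance" advice above.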
 
Pestilence, your thread intrigued me. I loaded up Crysis and RivaTuner and found the same thing, but for some reason mine straightens out after a minute or two.

Can you do me a huge favor? Add these two lines to your Crysis autoexec.cfg:

e_precache_level=1
r_TexturesStreaming=0

I'm really curious whether that does anything for you. If you're unfamiliar with Crysis config files, just create autoexec.cfg in the root Crysis directory with those two lines in it. Crysis will switch from showing "HIGH SPEC" to "CUSTOM," but your IQ won't change.
 
I had tried both of those well before I made this XML mod. Here is what happened:

r_TexturesStreaming=0 had no effect on the hitching, and I couldn't really tell the difference with it on or off, because the game was stuttering so badly to begin with.

e_precache_level=1 was VERY annoying: levels took about 10 minutes to load, and it didn't help performance at all.

When I get home today I'll try both of them together with the modded XML and see if it makes any difference, although I'm quite happy with Crysis performance right now.
 
Went and looked at RivaTuner: my temp had spiked to 100 degrees and the card shut down. Looks like the fan didn't kick in at all; to clarify, it stayed at a 22% fan duty cycle. This always happens with the bottom card; the fan profile is not working correctly on it. The top card in the system, however, can stay at a 40% fan duty cycle forced through RivaTuner, and it holds just fine.

Glad to see your progress...
Is there any consensus on what safe idle temps are? I bumped my fan up to 40% and found it a bit loud for a super-quiet system. I dropped it down to 35% and am at 52-53 idle (versus 42 @ 40% and 70 @ auto). :rolleyes:
 
Well, when I had only one X2 in the system, before RivaTuner even worked with the card, it idled at around 60-65 degrees. I ran like that for a month without any problems. Now, with the fans revved up a bit, it idles around 44-51 depending on which core you're looking at; the cores near the back of the card are always going to be hotter, that's unavoidable.

But under load, I don't see them going past 70-80, which I think is pretty awesome for a dual-GPU card with a single cooling solution.

Again, though, the 40% fan only works on my top card, so the default profile on the bottom card seems to be functioning and doing its job, although it runs a bit hotter than I'd like. Waiting on an official fix from ATI for that.
 
Hmmm... know what I just realized? I didn't change the voltages for the two lower power modes in the XML. Maybe that's what was causing the driver crash? I'll edit that when I get home and see what happens.
 
Hmmm... know what I just realized? I didn't change the voltages for the two lower power modes in the XML. Maybe that's what was causing the driver crash? I'll edit that when I get home and see what happens.
I noticed that too. I changed my voltages to the highest value for all three modes. It could cause a problem: in-game, the card could be dropping to 2D clocks, which you have forced to the high values, while the voltages drop to the low values. You'd be running high clocks on low volts.
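That mismatch (clocks forced flat but per-state voltages left at stock) can be spot-checked before activating a profile. A hypothetical sanity check along those lines, assuming the Want_0/Want_1/Want_2 properties line up by power state as in the profile posted earlier; the feature names are taken from that profile, everything else is illustrative:

```python
import io
import xml.etree.ElementTree as ET

def clock_voltage_mismatches(tree):
    """Return (card_index, state_index) pairs where the core clocks have been
    forced to one value across all power states, but a state's voltage is
    still below the top-state voltage (high clocks on low volts)."""
    features = {f.get("name"): f for f in tree.iter("Feature")}
    hits = []
    n = 0
    while f"CoreClockTarget_{n}" in features:
        clocks = [int(p.get("value")) for p in features[f"CoreClockTarget_{n}"]]
        volts = [int(p.get("value"))
                 for p in features.get(f"CoreVoltageTarget_{n}", [])]
        if volts and len(set(clocks)) == 1:  # clocks forced flat
            hits += [(n, i) for i, v in enumerate(volts) if v < max(volts)]
        n += 1
    return hits

# Stand-in mirroring one card entry from the profile above: clocks forced to
# 85500 everywhere, voltages left at 1244/1244/1304.
SAMPLE = """<Profile><Caste name="Graphics"><Groups><Group name="Overdrive5">
<Feature name="CoreClockTarget_0">
  <Property name="Want_0" value="85500"/>
  <Property name="Want_1" value="85500"/>
  <Property name="Want_2" value="85500"/>
</Feature>
<Feature name="CoreVoltageTarget_0">
  <Property name="Want_0" value="1244"/>
  <Property name="Want_1" value="1244"/>
  <Property name="Want_2" value="1304"/>
</Feature>
</Group></Groups></Caste></Profile>"""

print(clock_voltage_mismatches(ET.parse(io.StringIO(SAMPLE))))
# -> [(0, 0), (0, 1)]
```

Run against the profile posted above, it flags exactly the two lower power states whose voltages were never raised, which is the scenario described in this post.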
 
Got a message back from ATI about my ticket. They are aware of the issue and want to escalate it to the lab to see if they can reproduce it. I have to send them my msinfo32 output when I get home from work. Glad to see they're going to work on it.
 
Good news. I think yours is the first real response I've read about them giving someone who submitted a ticket; nothing here yet, even though the problem has pretty much disappeared on my end from what I've been monitoring.
 
Well, the profile trick seems to have helped my TF2 problems, and probably most of the other hitching I've experienced (I still have to test it, of course). I tried creating a second profile to return everything to normal, but apparently once the fan speed gets kicked up, the second profile won't set it back. Maybe I'll try it again after a reboot.
 
Yeah, I'm not sure why, but the fan profile settings kick in right when Windows boots; I don't even have to activate the custom profile in CCC. It must change a registry setting somewhere at some point? I don't really know.

But yeah, it has helped my gaming tremendously. Now I'm just waiting on an official fix so I can return everything to the defaults.
 
Well, I made a second profile, set it before rebooting, and it does set the fan speed and clocks back when the system comes back up.

I did notice one little quirk with the profiles: the first time I activate the custom profile, it speeds up the fan; the second time I activate it, it changes the clock speed. So if anyone has tried this and it's not working for them, try setting it twice while watching the ATI Overdrive page to see your clocks change.
 
The clock speed changes seem totally hit or miss for me.
Clean Vista 32-bit install.
 
I just tried this trick earlier today. After rebooting, the CCC host has stopped working. I've tried uninstalling and reinstalling the 8.3 suite multiple times, but I get the same error message every time I try to start CCC. Has anyone run into this problem?
 