GPU "Microstuttering" FAQ

Cool post, thanks.

I previously owned a 9800GX2 and saw microstuttering myself in SupCom, WiC and Crysis. I'm never going to do SLI/CF again until that gets fixed.

I am, however, very eager to see the debut of the 4870X2, to see whether the whole two-GPUs-on-one-card approach will solve that problem for them. If so, sign me up! :)
 
I was going to get SLI, but I would have had to deal with a crappy Nvidia mobo that was merely "fine", and after finding out about this I gave up.
 
That chart is deceptive. No video card renders at exactly the same rate the whole time. Just look at a FRAPS chart: it'll be all over the place even on a single card. So microstuttering isn't just a multi-GPU issue. If a single GPU renders frame A in 20ms, frame B in 22ms, frame C in 18ms... would you not see the same effect?

Principle of superposition: the two "waves" or graphs add to each other at any given time, which means the difference between max and min (the peaks and troughs) can be that much greater.
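To make that point concrete, here's a toy calculation (the 5 ms / 35 ms numbers are invented for illustration): in AFR, the two GPUs' presents can cluster so the gaps between displayed frames alternate short/long, and a FRAPS-style average hides how the long gaps feel.

```python
# Hypothetical AFR pacing: the average works out to 50 FPS, but the
# displayed intervals alternate 5 ms / 35 ms because the two GPUs'
# frames arrive bunched together rather than evenly spaced.
intervals = [5.0, 35.0] * 4               # ms between consecutive frames
avg_ms = sum(intervals) / len(intervals)  # 20 ms
avg_fps = 1000.0 / avg_ms                 # 50 FPS on paper
felt_fps = 1000.0 / max(intervals)        # ~28.6 FPS: what the long gaps feel like
print(avg_fps, round(felt_fps, 1))
```

A single card averaging the same 50 FPS would show ~20 ms gaps every time; the multi-GPU average is identical, but the worst-case gap is nearly double.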

I was considering SLI or Xfire for my next rig and have been looking at whether it's financially a good move. At first it actually appears so: for example, SLI'd 9800GTXs or Xfired 4850s are cheaper than a single GTX280. But then consider:

Cost of a SLI/Xfire capable motherboard
Cost of a SLI/Xfire capable PSU
Not perfect scaling (in fact quite bad for Xfire in some reviews I've seen)

I mean, the cost of the additional kit and possibly extra cooling is almost enough to cover the price gap between SLI/Xfire and a single expensive card. Big PSUs can cost as much as a 4850 to begin with!

Add to that the fact that it doesn't always scale that well. In some games it's OK, but the inconsistency is also bad: you never know, when you buy a new game, whether the scaling is going to be pants or not.

So that kind of left me on the fence, but reading [H]'s reviews about needing to push more frames on SLI/Xfire to keep games "smooth", plus the recent conversation about microstutter, is enough to put me off multi-GPU and stick with the most powerful single GPU, which atm is the GTX280.
 
I really had thought this microstuttering issue would have been fixed by now. I noticed it a couple of years (or more) ago when I first tried SLI with two 7800GTXs. I ended up selling one of them because I couldn't cope with the stuttering.

A few days ago I bought a 280, although I dithered for a bit: should I wait for the 4870X2? Get two 4850s? Or even two 4870s? But after seeing this thread and others I realised I would kick myself if I went with anything other than a single-GPU solution. For me it isn't just microstutters (although they are a big factor); you also get much more noise, more heat and of course the additional power requirements. For me, multi-GPU is just not worth the hassle.
 
I noticed the microstuttering with my 9800GX2 also, but thought it was just loading stutters. Makes much more sense now.
 
Thanks for this thread. The only thing that bugs me more than microstuttering is the fact that review sites never, ever mention it in their articles. The issue won't get resolved until there is more exposure and pressure on AMD/Nvidia to find a solution. In fact, review sites are openly advocating cheaper multi-GPU solutions over expensive single cards, which is real BS imo. I'll be in line to get my SLI/Crossfire once microstuttering is a thing of the past, but not a moment sooner.
 
If I notice microstuttering, can I minimize/eliminate it?
Yes. By running the game at a setting where your graphics cards are able to output more than the monitor's refresh rate (that is, the maximum FPS the monitor is capable of; the pixels on your screen can only change so fast), microstuttering is eliminated completely. Most monitors have a refresh rate of 60 or 70Hz, meaning you would need 70 or 80 FPS to eliminate microstuttering.
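A minimal simulation of why this works, using a simplified double-buffered vsync model (the render times below are made up): as long as every frame finishes within one refresh interval, the display flips exactly once per tick, so the viewer sees perfectly even pacing no matter how uneven the render times are.

```python
import math

# Simplified vsync model (hypothetical render times): a finished frame
# is shown on the next refresh tick, and the next frame starts at the flip.
refresh_ms = 1000.0 / 60.0                 # 60 Hz monitor -> ~16.7 ms per tick
render_ms = [6.0, 14.0, 5.0, 15.0, 7.0]    # uneven, but all under one tick

shown_at, t = [], 0.0
for r in render_ms:
    done = t + r
    tick = math.ceil(done / refresh_ms) * refresh_ms  # wait for next refresh
    shown_at.append(tick)
    t = tick

intervals = [b - a for a, b in zip(shown_at, shown_at[1:])]
print(intervals)  # every gap is exactly one refresh interval: even pacing
```

If any render time exceeded one tick, that frame would miss its refresh and the intervals would become uneven again, which is why the headroom above the refresh rate matters.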


I'm wondering if this might be an argument, cost considerations aside, for what most would consider massive overkill in GPU power.
Take my situation for example. I game @ 1600 x 1200 on a CRT capable of 85 Hz.
I'm looking to upgrade from SLI'd 7900 GTXs, which have always suffered from obvious hangs/microstutter. I have been lusting for a 280 while waiting for 4870 results. I've been thinking I could eventually run two 280s in SLI if games yet to come begin to tax the capabilities of one. However, perhaps SLI'ing a pair of them now could totally eliminate microstutter in any game currently out, by keeping minimum frames per second well above 85 while still allowing max in-game settings. Is this correct?

Thanks for this post - very informative for me.
 
I had buyer's remorse after I got my 9800GX2. I got more FPS, but I greatly missed the silky smooth feel of my 8800GTX. Thankfully, I was able to step up to a GTX280, which will be here today. I'm optimistic that my smoothness will return. ;)
 
Thanks for making this thread, Wizard. Maybe now we'll see fewer posts confusing microstuttering and good ol' stuttering.
 
Are there any games where the style of play or motion makes this issue relatively obvious? I'm understanding the concept thanks to the helpful graph, but I'd like to get a better practical feel of what to look for.

LMAO Best Post Ever.
 
I'm wondering if this might be an argument, cost considerations aside, for what most would consider massive overkill in GPU power.
Take my situation for example. I game @ 1600 x 1200 on a CRT capable of 85 Hz.
I'm looking to upgrade from SLI'd 7900 GTXs, which have always suffered from obvious hangs/microstutter. I have been lusting for a 280 while waiting for 4870 results. I've been thinking I could eventually run two 280s in SLI if games yet to come begin to tax the capabilities of one. However, perhaps SLI'ing a pair of them now could totally eliminate microstutter in any game currently out, by keeping minimum frames per second well above 85 while still allowing max in-game settings. Is this correct?

Absolutely. Again, I haven't been able to test this, but everything I know about the issue points to it. If your graphics cards are able to render faster than your pixels' rate of change (i.e. the refresh rate), it would be impossible for microstutter to worm its way into what's displayed. And even if it could, just due to the nature of DVI-D or analog connections, when you're rendering 85 FPS the worst a stutter could be (9:1) is only going to have a "variance" of 10ms, which even to the keenest eye would be pretty hard to catch.
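For what it's worth, that 10 ms figure checks out if "variance" means deviation from perfectly even pacing. A quick back-of-the-envelope, assuming the 9:1 split applies within each pair of frames:

```python
# At an average of 85 FPS, a pair of frames spans 2/85 s ~= 23.5 ms.
# A 9:1 split within each pair makes the long gap ~21.2 ms versus the
# ~11.8 ms of perfectly even pacing -- roughly 10 ms of deviation.
fps = 85.0
pair_ms = 2000.0 / fps           # time for two consecutive frames
long_gap = pair_ms * 9 / 10      # the slow frame of the 9:1 pair
even_gap = pair_ms / 2           # what even pacing would give
deviation = long_gap - even_gap
print(round(deviation, 1))       # ~9.4 ms, i.e. about the 10 ms quoted
```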

Are there any games where the style of play or motion makes this issue relatively obvious? I'm understanding the concept thanks to the helpful graph, but I'd like to get a better practical feel of what to look for.

LMAO Best Post Ever.

I dunno, seems like a fair question to me -- and I'm sorry I didn't notice that he had asked it. Where do you notice aliasing the most? Hard lines, like where a dark wall's edge meets the bright sky. Where do you notice anisotropic artifacts? Wherever you get really dense textures on very few pixels.

Where would you notice microstuttering the most? I guess you could make an argument that scenes with sharp, high contrast would be the worst. When a brown pixel undergoes microstuttering and so takes a long time to change to a slightly darker brown, and then very quickly changes to a darker mahogany, you're not going to notice it as much as a pixel that's changing from black to white in the same fashion. So yeah, games with high contrast might make the problem more noticeable.

That said, if you don't notice it, don't look for it, lol.
 
Holy crap. Now I'm really having second thoughts on wanting to try out CF'd HD 4870s. I have a really terrible GPU now that causes hangs and lag, but I wouldn't want to deal with that on $600 worth of GPUs. Crap... What to do?
 
What to do?
Wait for reviews and user experience to see if these issues are as pronounced in these new cards as they have been in other cases.

I'm getting the impression from the replies to this and other threads that noticeable, experience-ruining microstuttering is the exception rather than the rule with multi-GPU setups. You may be fine with these new cards, you may not. Don't worry, lots of people will run out and buy them before you do, and some of those people will be happy to test for this issue.
 
The attention you see this getting is a direct result of our real world gameplay testing.
 
Personally I don't care much about microstuttering; it is not nearly as bad as some here make it out to be. You get stuttering with every card out there if pushed hard enough...

I still have 30 days to decide and may step up my GX2 SSC to a GTX280, as I believe in the long run the 280 will "truly" distance itself from the GX2 as the single-card top performer.

From my personal point of view, microstuttering, when it occurs, makes a game pretty unplayable. With a single GPU you can get slowdown, yes, but not stuttering (unless something is wrong). The YouTube video posted by gerrson shows the stuttering; it doesn't look all THAT bad, but it "feels" much worse when you are actually playing the game.

The problem with stuttering is that it takes the frame rate from high to very low; it feels like your mouse/keyboard/joypad has got stuck (albeit briefly). I suppose some people get over it or don't notice it, but I certainly can't seem to do either.
 
This explains an awful lot. I've been running 7900 GTOs in SLI since they came out, on a 24" LCD, so yeah, the resolution is quite high. I don't really mind lower framerates, though, for some reason; that's another story.

But I do have to say that in some games I really do notice this microstuttering. Oblivion I could NEVER get smooth no matter what I did! Framerate was never an issue either. I noticed it a little in BioShock, but it's not near as bad as Oblivion.
 

You can go back and read through multiple reviews where we complained of this behavior even though framerates were above what is considered to deliver acceptable gameplay. Canned benchmarks / timedemos / custom timedemos / 3DMark do not identify this behavior as our gameplay testing does.
 
I believe in MS and its performance degradation issues across the board. I do have a question, though, because I thought I had issues with some microstuttering, but it turned out, at least this time, to be something else. I noticed in that YouTube video he was using a wireless device. I too had a similar problem in that game with my previous setup that would appear when using a wireless mouse. Could wireless devices add to what appears to be MS and/or some sort of wireless-induced lag?

I say this because when I went back to a wired mouse, that type of anomaly disappeared.
 
I believe in MS and its performance degradation issues across the board. I do have a question, though, because I thought I had issues with some microstuttering, but it turned out, at least this time, to be something else. I noticed in that YouTube video he was using a wireless device. I too had a similar problem in that game with my previous setup that would appear when using a wireless mouse. Could wireless devices add to what appears to be MS and/or some sort of wireless-induced lag?

I say this because when I went back to a wired mouse, that type of anomaly disappeared.
Intermittent wireless problems could create input lag issues, and thus could further degrade the playing experience by creating additional stuttering, but microstuttering is an orthogonal issue, a byproduct of GPU rendering timing alone.
 
I use Crossfire and find that disabling Catalyst AI eliminates most of the feel of microstuttering. I suspect this is because Catalyst AI forces AFR mode in games by default. Disabling it, I can tell it uses scissor mode in most games by the way things appear when one card has an error.
 
I believe in MS and its performance degradation issues across the board. I do have a question, though, because I thought I had issues with some microstuttering, but it turned out, at least this time, to be something else. I noticed in that YouTube video he was using a wireless device. I too had a similar problem in that game with my previous setup that would appear when using a wireless mouse. Could wireless devices add to what appears to be MS and/or some sort of wireless-induced lag?

I say this because when I went back to a wired mouse, that type of anomaly disappeared.

A lot of the video shows stuttering when moving forward and strafing. He appears to have a Logitech G11, which is wired, so I suspect the stuttering is related to the graphics cards. However, I do agree that a wireless mouse or keyboard which lost connectivity very briefly would give you a very similar effect.

I should point out (to other posters) that I personally don't believe that the stuttering is an exception at all. It seems more likely that people either don't notice it or run games at a high enough frame rate to make the stuttering non-existent or unnoticeable. I have a number of friends who have complained about it over the years and dropped the idea of SLI or Crossfire. I had, however, "assumed" that it would have been sorted out by now.
 
The way micro-stuttering is described, I can say I've experienced it with my single 8800GTX. I can repro it in Half-Life 2: Episode Two while looking at some burning propane flames.

The exact spot is here: http://guides.ign.com/guides/812574/page_21.html
(4th screenshot, with the big white propane tank and the guy on fire). When the flames are turned on and I look at the flame, I get stuttering like what was described here. Can someone else reproduce this?
 
Generally speaking, when I've noticed microstutter with my HD 3850 Crossfire rig, it was while moving about or panning around quickly to face the opposite direction in a game.

It is most annoying when you are moving forward and you get these regular hitches or rough surges in video update. You can call it hitching, chop, or stutter. It blows by any name.

I remember when I had a 7950 GX2, TRL with next-gen content and AA on was darn near unplayable on some levels due to the hitching. The start of the Japan level was nauseating.
 
The way micro-stuttering is described, I can say I've experienced it with my single 8800GTX. I can repro it in Half-Life 2: Episode Two while looking at some burning propane flames.

The exact spot is here: http://guides.ign.com/guides/812574/page_21.html
(4th screenshot, with the big white propane tank and the guy on fire). When the flames are turned on and I look at the flame, I get stuttering like what was described here. Can someone else reproduce this?

I can't say for certain, but it sounds more like a bad slowdown in FPS rather than stuttering. Of course I could be wrong! :) Some particle effects used to, and still do, cause poor FPS.
 
After reading today's review of the HD 4870, I may not even need to buy a second one for a while. Guess I'll just buy one once they come out, but I'll certainly leave the option open for a second one.
 
In my experience, CF and SLI (in AFR mode) work best at high frame rates. If you are struggling in the low-FPS zone even with your multi-GPU setup, it 'feels' worse than the equivalent FPS on a single card because of the variable lag between frames. That sums up my experience with both SLI and CF, although SFR is not quite as bad (just not a very good boost quite often).
 
The way micro-stuttering is described, I can say I've experienced it with my single 8800GTX. I can repro it in Half-Life 2: Episode Two while looking at some burning propane flames.

The exact spot is here: http://guides.ign.com/guides/812574/page_21.html
(4th screenshot, with the big white propane tank and the guy on fire). When the flames are turned on and I look at the flame, I get stuttering like what was described here. Can someone else reproduce this?

No, you are not getting microstuttering.
 
I use Crossfire and find that disabling Catalyst AI eliminates most of the feel of microstuttering. I suspect this is because Catalyst AI forces AFR mode in games by default. Disabling it, I can tell it uses scissor mode in most games by the way things appear when one card has an error.

Can you please elaborate on what scissor mode is, and what did you mean by "when one card has an error"? Is it better to run CF in scissor mode, or will this "microstutterless" mode hurt performance?
 
Can you please elaborate on what scissor mode is, and what did you mean by "when one card has an error"? Is it better to run CF in scissor mode, or will this "microstutterless" mode hurt performance?

I assume by scissor he means SFR, or split frame rendering.
Basically, each card renders half the screen, but from what I have read it's fairly problematic and does not see the same performance as AFR (which is the cause of microstutter).
 
I assume by scissor he means SFR, or split frame rendering.
Basically, each card renders half the screen, but from what I have read it's fairly problematic and does not see the same performance as AFR (which is the cause of microstutter).

Perhaps it is a decent compromise? How much performance loss would one expect vs AFR? I guess it is game dependent, since both cards are not being put through the same stress.

I've heard from some of the early 4850 CF owners that microstuttering is not as bad as claimed by some, or even eliminated. Maybe CF is now improved vs older generations?
 
Can you please elaborate on what scissor mode is, and what did you mean by "when one card has an error"? Is it better to run CF in scissor mode, or will this "microstutterless" mode hurt performance?

AFR mode is when one card renders the odd frames, and the other card renders the even frames.

Scissor mode is when each card handles part of the rendering load for each frame. The division is not always half the screen: scissor mode changes the screen split dynamically in order to divide the rendering load as close to 50/50 as possible. I.e., if there is a big empty sky in the top half of the image, the split will be more like one card doing the top 80% of the image (all the sky and a bit of the ground) and the other card doing the bottom 20% (which has more foreground elements with sharper textures, etc.), so that ideally each card finishes rendering its portion of the image simultaneously. This dynamic allocation is not perfect, so you get a deviation, and thus microstutter (or at least something that feels like it, if it's not technically microstutter), in scissor mode as well. From my own personal experience, I'd guess the deviation is smaller, because scissor mode is way smoother for me in pretty much all new games (meaning ones that tax a CF setup).

I wish ATI would release drivers that let you specify which CF mode to use. Not to mention fan controls!
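The dynamic split described above amounts to a load-balancing search. Here's a rough sketch (the per-row cost model and `split_row` function are invented for illustration, not how the driver actually works):

```python
def split_row(row_costs):
    """Pick the row where the cost above and below is closest to 50/50."""
    total = sum(row_costs)
    running, best_row, best_diff = 0.0, 0, float("inf")
    for i, cost in enumerate(row_costs):
        running += cost
        diff = abs(running - (total - running))  # imbalance if split after row i
        if diff < best_diff:
            best_diff, best_row = diff, i
    return best_row

# 10 rows of screen: cheap empty sky on top, expensive ground below.
costs = [1, 1, 1, 1, 1, 1, 10, 10, 10, 10]
print(split_row(costs))  # 7 -- the split lands well past the screen's midpoint
```

With uniform costs the split would sit at the middle; with cheap sky on top, one card ends up with most of the screen area but roughly half the work, which matches the 80/20 example above.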
 
I see. I guess so far there's nothing we can really do against microstuttering. Hopefully with some driver revisions it can improve. I wonder if the 4850 in CF have any stuttering?
 
@christpunchersg
Anything is possible, but more than two years down the road nothing seems to have changed. Although some of the posts in this thread suggest it's improved with the 4850s, and ATI has some new tech in the 4870X2, there seems to be a fundamental problem with multi-GPU, as stated by the OP in his "Why does microstuttering happen" section.

Assuming I'm understanding this issue correctly, I can't see how it can be fixed easily. You can't just buffer frames, because there isn't time to do it, and if there was, you would get lag. You can't predict how long the next frame will take to render, because the only way to do that is to render it...
 
Any news on whether the 4850/4870s still have this issue in Crossfire mode? I'm thinking about getting a 4870 X2 when they release... but I've confirmed this issue with some people in person too, and if it's not improved I doubt I'm going to bother with multi-GPU any time soon. It almost seems like a downgrade.
 
Anyone with 2-way GTX 280s seeing this issue? I've not, in Crysis, UT3, Call of Juarez, or the GRID demo thus far.
 