GPU "Microstuttering" FAQ

MrWizard6600

So everyone's been posting about how an SLI or Crossfire rig, while managing to put out higher FPS than single-GPU solutions, does so while "microstuttering". I'd like to clarify as much as I can around the issue, and hopefully you guys can help me out too. I don't have access to a multi-GPU rig right now, so I'm going from what I learned about SLI and Crossfire setups from my last rich customer's dual-8800 build.

What is microstuttering?
When running two graphics processing units (GPUs) in tandem, via Crossfire or SLI, in Alternate Frame Rendering (AFR) mode, the two GPUs produce frames asynchronously (for lack of a better term). One way to describe microstuttering is your computer experiencing, in extremely rapid succession, a high FPS, followed by a low FPS, followed by a high, then a low, and so on.

The following is a chart of frames vs. time, with time on the Y-axis (the chart labels it in nanoseconds, but the numbers only work out as microseconds: 100,000, the first increment, is 0.1 seconds, or 100 milliseconds) and the frame count (NOT FPS) on the X-axis. The lower the bar, the higher the FPS.
Frametimes-1.jpg

Image taken from PC Games Hardware

As you can see, the Crossfire setup, while getting out the same number of frames (30) in less time (i.e., while maintaining a higher FPS), has a certain wobble. This wobble is the phenomenon known as microstuttering.
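
If you want to see the wobble in numbers rather than bars, here is a rough sketch in plain Python. The timestamps are made up to mimic an AFR rig, not taken from the chart above; it just shows how a frames-vs-time series turns into the per-frame intervals your eye notices.

[code]
# Made-up cumulative timestamps (ms) for 9 frames from a hypothetical AFR rig.
timestamps_ms = [0, 5, 33, 38, 66, 71, 99, 104, 132]

# The interval between consecutive frames is what you actually perceive.
intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
print(intervals)                          # [5, 28, 5, 28, 5, 28, 5, 28]

avg = sum(intervals) / len(intervals)
print(f"{avg:.1f} ms average -> ~{1000 / avg:.0f} FPS on paper")
[/code]

The average looks healthy, but the short/long alternation is exactly the wobble in the bars.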

Here is another chart, from AnandTech, found by Mega 666:
microstutter2ls7.png

Pulled from Anand

The Multi-GPU "Variance" is much greater than that of the Single GPU solution. The greater the difference from one variance to the next, the greater the stutter. The difference between the average variance and the variance at any one frame is the phenomina of microstuttering.

How does microstuttering impact me in game?
Microstuttering can make playing what FRAPS calls a 60 FPS game feel identical to playing a 30 FPS game (literally; this is a mathematically possible case).
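
A quick back-of-the-envelope illustration of that worst case, with hypothetical numbers, assuming the two GPUs finish their frames almost back to back:

[code]
# 60 frames arrive in one second, but in tight pairs: 0.4 ms between the
# two GPUs' frames, then a ~32.9 ms wait until the next pair.
pair_gap = 0.4
cycle = 1000 / 30                               # a pair arrives every 33.3 ms

intervals = [pair_gap, cycle - pair_gap] * 30   # one second's worth of frames
print(round(len(intervals) / (sum(intervals) / 1000)))  # 60 -> the FPS counter is happy
print(round(max(intervals), 1))                 # 32.9 -> the pacing your eye sees is ~30 FPS
[/code]

The counter reports 60 FPS, but every second frame shows up a fraction of a millisecond after the previous one and adds almost no new motion, so the game feels like 30 FPS.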

Why does microstuttering happen?
It's a product of a multi-GPU solution failing to synchronize properly. Frame synchronization is the act of making sure that the time between frames is identical no matter where you take a measurement. In a single-GPU solution, one GPU builds the image and then sends it off to the monitor. It then builds another and sends it off as well. Thus, a single-GPU solution does not suffer from microstutter. In a dual-GPU solution, two GPUs build separate images. In Alternate Frame Rendering (AFR), GPU "A" must send its image to the monitor exactly halfway between the previous frame from GPU "B" and the next frame (which will be from GPU "B" as well).

Note: 20 milliseconds is 0.02 seconds, or 1/50th of a second.
For my example, let's go with a game running at 50 FPS. A frame is built and displayed by GPU "A". Exactly 20 milliseconds (ms) later, GPU "B" must have completed building and displaying its frame. Exactly 20 ms after that, GPU "A" must have finished building and displaying the next frame, and so on. Each frame must be displayed exactly 20 ms after the previous one.
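
Here is that pacing requirement as a rough Python sketch. The function name and timings are mine, purely for illustration: each GPU only has to deliver a frame every 40 ms, but GPU B's frames have to land exactly 20 ms after GPU A's, and a small offset error turns into the short/long pattern described above.

[code]
# Ideal AFR pacing at 50 FPS: each GPU finishes a frame every 40 ms,
# with GPU B offset by half a cycle. 'drift_ms' models B running early.
def afr_timestamps(frames=10, cycle_ms=40.0, drift_ms=0.0):
    times = []
    for i in range(frames):
        if i % 2 == 0:                                    # GPU A's frames
            times.append((i // 2) * cycle_ms)
        else:                                             # GPU B's frames
            times.append((i // 2) * cycle_ms + cycle_ms / 2 + drift_ms)
    return times

for label, drift in (("synced", 0.0), ("B early by 12 ms", -12.0)):
    ts = afr_timestamps(drift_ms=drift)
    print(label, [round(b - a) for a, b in zip(ts, ts[1:])])
# synced            [20, 20, 20, 20, 20, 20, 20, 20, 20]
# B early by 12 ms  [8, 32, 8, 32, 8, 32, 8, 32, 8]
[/code]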

I'm currently working on a Flash demo to illustrate the point. When it's done I'll export it to .gif and post it here... assuming [H] supports gifs lol.

Do all multi-GPU rigs experience microstuttering?
It looks like it, yes. The extent might differ from one system to another; it might even differ every time you start your machine or run the game engine.

If I notice microstuttering, can I minimize/eliminate it?
Yes. By running the game at settings where your graphics cards can output more than the monitor's refresh rate (that is, the maximum FPS the monitor is capable of; the pixels on your screen can only change so fast), microstuttering is eliminated completely. Most monitors have a refresh rate of 60 or 70 Hz, meaning you would need roughly 70 or 80 FPS to eliminate microstuttering. Also, running the game in Split Frame Rendering (SFR), with the top half rendered by one card and the bottom half rendered by the other, will eliminate microstuttering, but it opens the door to tearing and a performance hit. I don't know if SFR is even still supported...
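
A rough sketch of why the "stay above the refresh rate" trick works, assuming V-sync is on so frames are only shown on refresh boundaries (made-up render times; this is just the V-sync idea, not anything driver-specific): once every frame is ready before the next refresh, the monitor hands frames out on its own fixed schedule and the uneven GPU timing is hidden behind the wait.

[code]
# With V-sync, a frame is shown on the first refresh after it finishes.
# If every frame beats the next refresh, displayed intervals are all equal.
refresh_ms = 1000 / 60                       # ~16.7 ms per refresh

finish_ms = [0, 4, 20, 24, 40, 44, 60, 64]   # uneven AFR completion times (made up)

shown, next_refresh = [], 0.0
for t in finish_ms:
    while next_refresh < t:                  # wait for the first refresh after t
        next_refresh += refresh_ms
    shown.append(next_refresh)
    next_refresh += refresh_ms               # at most one frame per refresh

print([round(b - a, 1) for a, b in zip(shown, shown[1:])])
# [16.7, 16.7, 16.7, 16.7, 16.7, 16.7, 16.7]  -> the wobble is absorbed
[/code]

Of course, the cards have to keep beating the refresh for this to hold; drop below it and the wobble comes back.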

If there's any other point anyone would like me to make, by all means post it and I'll include it.
 
Ahh, good stuff. Now I understand why CSS occasionally looked the way it did after I SLI'd two GTS G92s.
 
Are there any games where the style of play or motion makes this issue relatively obvious? I'm understanding the concept thanks to the helpful graph, but I'd like to get a better practical feel of what to look for.
 
That chart is deceptive. No video card renders at exactly the same rate the whole time. Just look at a FRAPS chart. It'll be all over the place even on a single card. So microstuttering isn't just a multi-GPU issue. If a single GPU renders frame A in 20 ms, frame B in 22 ms, frame C in 18 ms... would you not see the same effect?
 
That chart is deceptive. No video card renders at exactly the same rate the whole time. Just look at a FRAPS chart. It'll be all over the place even on a single card. So microstuttering isn't just a multi-GPU issue. If a single GPU renders frame A in 20 ms, frame B in 22 ms, frame C in 18 ms... would you not see the same effect?

I believe it would be those fluctuations that CAUSE microstuttering. If the cards rendered at a constant rate, it would be easy to offset one from the other and have a perfect rhythm between the two. The GPUs have to constantly change the offset delay between the two cards, and that's what is so complicated.
 
I believe it would be those fluctuations that CAUSE microstuttering. If the cards rendered at a constant rate, it would be easy to offset one from the other and have a perfect rhythm between the two. The GPUs have to constantly change the offset delay between the two cards, and that's what is so complicated.

What about buffering the rendered frames and feeding them out at a constant rate, sort of like how triple buffering fixes the V-sync issues?
 
That chart is deceptive. No video card renders at exactly the same rate the whole time. Just look at a FRAPS chart. It'll be all over the place even on a single card. So microstuttering isn't just a multi-GPU issue. If a single GPU renders frame A in 20 ms, frame B in 22 ms, frame C in 18 ms... would you not see the same effect?

That chart is over the course of 30 frames, which is less than a second (as you can see, ~0.7 seconds).

I knew this would come up and I'm going to have to incorporate it somehow. What you're suggesting is nearly impossible: the odds that a frame's content has changed so dramatically that the GPU requires a whole additional 4 ms to render it are extremely small. A sharp drop or increase in FPS, even one as sudden as some of [H]'s charts indicate, would, when scaled to fit this chart, translate into a long curve, not a spike. I'll work on that now, actually...
 
This is very interesting and I will be looking forward to more results.
Acquiring a pair of 4870s seems like a valid option, but this kind of weakness could maybe change my mind.
 
I've heard AMD has redesigned their bridge and patched their drivers to help prevent microstutter.
 
I've heard AMD has redesigned their bridge and patched their drivers to help prevent microstutter.
Supposedly, the 4870x2 has been redesigned to mitigate microstuttering, but I'll believe it when I see it.

+1 to the OP for posting this.
 
Doing some research on this. Never experienced it myself but..

An interesting and long thread at anandtech -

http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2178762&FTVAR_STKEYWORDFRM=&STARTPAGE=1&FTVAR_FORUMVIEWTMP=Linear

Another graph which may help.

http://img353.imageshack.us/img353/2253/microstutter2ls7.png

Now that one makes more sense and shows a much more drastic variation in frame output than the original post did. So micro-stuttering still occurs on a single card, it's just so fast that no one can see it.
 
Now that one makes more sense and shows a much more drastic variation in frame output than the original post did. So micro-stuttering still occurs on a single card, it's just so fast that no one can see it.

It's the delta in latency between each rendered frame. With AFR, that delta is large enough to perceive as interrupting or inhibiting the even "flow" of frames rendered.
 
I have seen this issue for a while and it drives me nuts, as Trepidati0n stated! I am surprised that it wasn't brought up a while back. When I talked about stuff like this, people told me I was seeing things. lol
 
Excellent post OP. I've seen it in the few SLI rigs I've worked with, but nice to see the data behind it :).
 
So SFR would eliminate this? What about that "supertiling" mode that CF was supposed to support? It looks like the problem is not multi-GPU per se but AFR-only rendering. This could be eliminated in drivers. I'm hoping that 4870x2 will not have this problem, but even if it doesn't (due to some engineering wizardry) GTX 280 SLI and 4870 multi-slot CF is still going to have it.
 
It's GPU-to-GPU sync. Alternate frame rendering across two GPUs is the cause of it. V-sync helps; the SLI AA modes and split frame rendering *might* help too.
 
This problem can definitely be addressed at the driver level. If you follow the link to AnandTech above, someone has already implemented a test which attempts to eliminate microstuttering. The results were positive.

Follow here -

If your first video card is 2 ms behind, simply delay the second video card's frame by 2 ms for an even flow. Make sense?
 
This problem can definitely be addressed at the driver level. If you follow the link to AnandTech above, someone has already implemented a test which attempts to eliminate microstuttering. The results were positive.

Follow here -

If your first video card is 2 ms behind, simply delay the second video card's frame by 2 ms for an even flow. Make sense?

From the chart you listed, 2 ms is common on a single card and wouldn't be noticeable to the end user (since no one complains about this on a single-card setup). It's the wild 30-50 ms swings in variance that seem to be what people are noticing. And if you delay a frame by 50 ms because the other card's last frame was late, you drop your average frame rate severely.
 
Well, that was just an example. You average the delta of the delay appropriately. Like I said, it's been done already, proof of concept so to speak. Now all we need is for the two major vendors to implement it.
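
To make "average the delta" concrete, here is a rough sketch of that kind of frame metering. The function and numbers are hypothetical, not the actual test from the AnandTech thread: pace each frame by the average interval over the last full AFR cycle and hold back frames that finish early.

[code]
# Hedged sketch of frame metering: present each frame no earlier than the
# previous one plus the average interval over the last two raw frames.
def meter(raw_ms):
    presented = list(raw_ms[:2])                   # nothing to average yet
    for i in range(2, len(raw_ms)):
        pace = (raw_ms[i] - raw_ms[i - 2]) / 2     # avg interval over one AFR cycle
        target = presented[-1] + pace
        presented.append(max(raw_ms[i], target))   # can't show a frame before it's rendered
    return presented

raw = [0, 5, 33, 38, 66, 71, 99, 104, 132]         # made-up 5/28 ms AFR wobble
out = meter(raw)
print([round(b - a, 1) for a, b in zip(out, out[1:])])
# [5, 28, 16.5, 16.5, 16.5, 16.5, 16.5, 16.5]  -> evened out, at the cost of a
# little added latency on the frames that were held back
[/code]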
 
Video lag is just that, lag. By delaying card #2 by the amount of lag card #1 has, we've eliminated the delta. Remember: Microstutter <> Stutter (they are not equal).
 
That chart is deceptive. No video card renders at exactly the same rate the whole time. Just look at a FRAPS chart. It'll be all over the place even on a single card. So microstuttering isn't just a multi-GPU issue. If a single GPU renders frame A in 20 ms, frame B in 22 ms, frame C in 18 ms... would you not see the same effect?

That chart is over the course of 30 frames, which is less than a second (as you can see, ~0.7 seconds).

I knew this would come up and I'm going to have to incorporate it somehow. What you're suggesting is nearly impossible: the odds that a frame's content has changed so dramatically that the GPU requires a whole additional 4 ms to render it are extremely small. A sharp drop or increase in FPS, even one as sudden as some of [H]'s charts indicate, would, when scaled to fit this chart, translate into a long curve, not a spike. I'll work on that now, actually...

Made a chart to illustrate my point:

Here is an FPS graph that's "all over the place", right?
FPSspiketomicrostuttercopy.jpg

(Taken from the [H] GTX 280 review. The red and white line is two 9800 GTXs in SLI. I shopped it a bit, changing the scale to demonstrate his example point better.)

The frames-vs-time graph is cumulative, whereas the FPS-vs-time graph is not. That, combined with the fact that the FPS graph's time axis is 25 times smaller than that of the first chart I posted, means that when converted, even a sharp spike (the one between the two white lines in the above pic) translates into something like this:
Frametimes9800GX2copy.jpg


(Note: I only used a handful of mathematically calculated points, and I exaggerated the arc a bit; it should be even flatter.)
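
If anyone wants to redo that conversion themselves, here is a rough sketch with made-up FPS samples (not the [H] data) of turning an FPS-vs-time trace into the cumulative frames-vs-time view used in the first chart:

[code]
# Convert per-second FPS samples into cumulative frame timestamps.
fps_per_second = [60, 60, 40, 60, 60]        # made up: a one-second dip to 40 FPS

timestamps_ms, t = [], 0.0
for fps in fps_per_second:
    frame_time = 1000.0 / fps                # ms per frame during that second
    for _ in range(fps):
        t += frame_time
        timestamps_ms.append(t)

# A 30-frame window around the dip: the "spike" becomes a gentler slope change.
window = timestamps_ms[110:140]
print(round(window[-1] - window[0]))         # ~650 ms across the window vs ~483 at a steady 60
[/code]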

Supposedly, the 4870x2 has been redesigned to mitigate microstuttering, but I'll believe it when I see it.

+1 to the OP for posting this.

Many such rumors are circulating. It's not that hard to do: all you need is some sort of chip that just adds a little bit of latency to the frames that come in early. I'd imagine it's easy (and even cheap) to do through hardware. Doing it through the drivers, on the other hand, with current hardware, I don't know.

and thanks :)
 
I understood it when you explained it the first time.
 
I understood it when you explained it the first time.

Eh, but others might not. It's a good point, and I want the guy who doesn't know what a graphics card is to be able to understand it... lol

Gotta shrink it though, and if I do, my 3-pixel-wide vertical white lines on the first pic become illegible.
 
Have you ever played Quake Wars? I have one video card and I swear Quake Wars suffers from this same microstuttering problem, but with one card. This is an excellent thread. If I had to deal with what Quake Wars does in every game because I had an SLI, Crossfire, or X2 card... holy hell.
 
Many such rumors are circulating. It's not that hard to do: all you need is some sort of chip that just adds a little bit of latency to the frames that come in early. I'd imagine it's easy (and even cheap) to do through hardware. Doing it through the drivers, on the other hand, with current hardware, I don't know.

Actually, the X2 will use a new feature of GDDR5 that allows the bandwidth of a chip to be split between two GPUs (clamshell mode). This allows a common RAM space, making the two GPUs act as one, or as close as it gets without an MCM. Not only do you get rid of micro-stuttering, you also get much better scaling, as the GPUs can talk to each other through the RAM at high speed.

Two 4870X2s in CF will still use AFR, though. But hopefully, if CF'd X2s have enough brute force, you can just use V-sync without fear of FPS dipping to 30 or 15.
 
I went from a G92 GTS to a GX2, and while I'm happy with the performance, I do see the microstutter. Until reading this I was WTF@#$!!!!! is going on. In CoD4, when I go from calm to all hell breaking loose in a split second, it feels like my screen is wobbling, which never happened with my GTS.
 
I have a question for those who see microstuttering: is it bad enough that you are considering going back to a single video card setup?
 
I was under the impression that microstuttering referred to those split-second freezes you sometimes get, often in quick succession. It seems like a strange name for what's being identified in this thread to me.

Anyway, I must have experienced it the other day playing Crysis multiplayer. I was trying to shoot a helicopter down with a missile launcher, but every time a cloud came into view while zoomed in, my fps felt like it was dropping to about 10, though it still said it was in the 40s or so. That was on a very high config that I'm no longer using at the moment.
 
Personally, I don't care much about microstuttering; it is not nearly as bad as some here make it out to be. You get stuttering with every card out there if pushed hard enough...

I still have 30 days to decide and may step up my GX2 SSC to a GTX 280, as I believe in the long run the 280 will "truly" distance itself from the GX2 as the single-card top performer.
 
This sounds like the perfect candidate for a driver fix.

To work, ATi and nVidia!
 