Micro-Stuttering And GPU Scaling In CrossFire And SLI

Micro-Stuttering: So Subtle, Yet So Annoying

What Is Micro-Stuttering?

Let’s recap how successive frames are generated. Since the ideal "perfect balance" isn't achievable in the real world and the system cannot render, say, 30 frames in advance, the following scenario is typical:

What we have here is a close-up of one second of Metro 2033 game play, rendered on a pair of Radeon HD 6850 GPUs. We chose this title and hardware combination because it yields a frame rate very close to 30 FPS. That's where the micro-stuttering phenomenon, if it exists at all, is rumoured to be most aggravating.

Looking at the diagram, we immediately spot the dramatic variation in the number of milliseconds it takes to render each frame. The pie chart shows all 30 frames within the one-second period that we analysed. It's easy to notice the delay of some frames, which is, in turn, perceived as stuttering. And it doesn't help that the next frame then arrives almost immediately afterward; that catch-up keeps the average frame rate looking high, but it doesn't smooth out the delivery.

You end up with what feels like a stuttering engine. Yes, it's going 30 MPH, just like a smooth inline-six. But this one hits the same speed while feeling like it has one cylinder out of whack.
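The behaviour in the pie chart can be sketched in a few lines. The frame times below are invented for illustration (not the article's measured data), but they show how two GPUs can deliver 30 frames in one second, and therefore "average 30 FPS", while alternating between short and long frame times:

```python
# Hypothetical illustration: two GPUs in alternate-frame rendering deliver
# 30 frames in ~1 second, but the frames arrive with alternating pacing.
even = [13.3] * 15   # frames that arrive quickly (ms) -- assumed values
odd = [53.3] * 15    # frames that lag behind (ms) -- assumed values

# Interleave the two sequences: fast frame, slow frame, fast frame, ...
frame_times = [t for pair in zip(even, odd) for t in pair]

total_ms = sum(frame_times)
avg_fps = 1000 * len(frame_times) / total_ms

print(f"frames: {len(frame_times)}, average: {avg_fps:.1f} FPS")
print(f"shortest frame: {min(frame_times)} ms, longest: {max(frame_times)} ms")
```

The average comes out at roughly 30 FPS, yet every other frame takes four times as long as its neighbour, which is exactly the "one cylinder out of whack" feel described above.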

When Can Micro-Stuttering Be Seen?

In a nutshell, all of the time. The lower the average frame rate, the larger the gap between the frame rate you perceive and the frame rate you measure. Thus, as bad luck would have it, a measured 30 FPS may be perceived as merely 20 to 25 FPS. The human eye does, however, still notice uneven frame delivery even beyond 60 FPS.

This is one of the reasons why we prefer testing with higher frame rates in the GPU scaling tests on the following pages. It continues to amaze us how, even beyond the generally-accepted target of 40 FPS, you can still see the impact of micro-stuttering once rendering becomes imbalanced.

Comments from the forums
  • Stupido
    Just an idea: maybe stuttering can be described as standard deviation of the mean frames per second?

    In this way you can have some quantitative measure of the stuttering...
  • aje21
    Will future SBM competitions take this into account? Multi-card setups seem to be very common. Perhaps the frame rate reported should be based on the low-point seen with micro-stuttering?
  • Anonymous
Great work, this has cleared up a lot of questions I had regarding micro-stuttering. TBH, I think I'm just going to buy a GTX 580
(I was thinking of dual), but after seeing this article I have changed my mind: your frame rate is only as good as its lowest point. Average/top frame rates really don't matter if you drop from 60 fps to 20 fps and stutter. Thanks to all at Tom's.
  • technogiant
Nice article....uhhmm...makes me wonder if the lower power consumption/heat production of the next-gen 28 nm GPUs will allow for X3 triple mid-range GPU cards with a nice large lump of GDDR RAM as a flagship, rather than a micro-stuttery X2 card?
  • pantsu
Micro-stuttering is also heavily game-dependent; some engines stutter more than others. The selection of games in the article is limited, so I wouldn't take it at face value. Still, in the games tested, it seems like Nvidia does a better job of reducing stutter. I'm personally considering buying a second 6950, but I might end up selling it if the stuttering is too noticeable.

A good way to measure micro-stuttering is to use Fraps and its frame-time log. You can create a graph from those values that shows the speed of every single frame (fps = 1000 ms / (frametime - previous frametime)), instead of the one-second averages that mask the stuttering in normal fps graphs.
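    The method pantsu describes can be sketched as follows. The timestamps are made-up sample data; the sketch assumes the Fraps frametimes log records a cumulative timestamp in milliseconds for each frame, so the per-frame rate is 1000 divided by the difference between consecutive entries:

```python
# Per-frame fps from a Fraps-style frametimes log (cumulative ms timestamps).
# The timestamps below are invented sample data, not real capture output.
timestamps = [0.0, 15.0, 48.0, 62.0, 95.0, 110.0]  # ms since capture start

# fps for each frame = 1000 / (current timestamp - previous timestamp)
per_frame_fps = [
    1000.0 / (curr - prev)
    for prev, curr in zip(timestamps, timestamps[1:])
]

print([round(f, 1) for f in per_frame_fps])
```

Plotting `per_frame_fps` rather than a one-second average makes the alternating fast/slow pattern of micro-stuttering visible as a sawtooth instead of a flat line.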
  • alangeering
    I'd like to first thank Igor and Greg for a very insightful article, and for discussing the not-often-discussed phenomenon of stuttering.

    There's one thing I'd like to expand upon.

    A few times in the article, the observation is made that while dual-GPU scaling is good, the stuttering effect is bad. No real point is made that when scaling is poor, stuttering is less pronounced. It's precisely because three cards aren't as efficient that stuttering is reduced.

    Bear with me and I'll explain.

    For the following thought experiment, I've used the data from the Call of Juarez graph on the page called "Step 2: Crossfire with three GPUs".

    Three situations:
    A: 1 card @ 70 fps average
    B: 2 cards @ 135 fps average
    C: 3 cards @ 160 fps average

    In other words:
    A: The card takes an average of 14.3 ms to produce the frame.
    B: Each card has 14.8 ms to produce the frame to maintain the average.
    C: Each card has 18.8 ms to produce the frame to maintain the average.
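    The per-card budgets above can be sanity-checked with a short sketch (an editorial illustration in Python, not part of the original comment). With n cards alternating frames, each card has n / fps seconds per frame:

```python
# Per-card frame budget under alternate-frame rendering: with n cards taking
# turns, each card must finish its frame within n / fps seconds.
# The fps figures are the Call of Juarez numbers quoted in the comment.
def per_card_budget_ms(cards, fps):
    """Time each card has to render its frame, in milliseconds."""
    return 1000.0 * cards / fps

for cards, fps in [(1, 70), (2, 135), (3, 160)]:
    budget = per_card_budget_ms(cards, fps)
    print(f"{cards} card(s) @ {fps} fps -> {budget:.1f} ms per card")
```

This reproduces the 14.3 ms, 14.8 ms, and 18.8 ms figures in situations A, B, and C.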

    Look again at the data from Call of Juarez.
    The lowest frame rate recorded for the single card is 60fps or 16.7 ms per frame.

    This exceeds the 14.8 ms budget needed to avoid delaying/stuttering the pipeline in situation B, but...
    This is well within the 18.8 ms time frame for the 3-card setup in situation C.

    As frames are now arriving in time for use, the evidence of stuttering is reduced.

    So efficiency is good; but inefficiency in scaling allows each card a little longer to provide its frame, and the eventual combined frame rate is lower but smoother.

    A quote from the article:
    "This phenomenon manifests itself even more seriously in CoJ. While CrossFire scales well under load, it becomes even more susceptible to micro-stuttering."

    And another:
    "For some reason, the third GPU almost always eliminates micro stuttering and has a less-pronounced effect on performance."

    You got so close; it just needed another jump of statistical thinking. Efficiency correlates with stuttering (for both NVIDIA and AMD), and there is a logical reason why.
  • alangeering
    The above post isn't trying to explain why microstuttering occurs - only why it's more pronounced as multi-gpu scaling increases. (and less so as scaling efficiency decreases)
  • technogiant
    @alangeering......that's an interesting theory. Following it through to its logical conclusion: if you were to artificially cap the maximum frame rate below what the graphics setup is capable of producing, then this should also reduce stuttering. Does imposing V-sync at, say, 60 Hz on a system capable of producing, say, 80 fps reduce stuttering?
    Is it possible to impose various frame rate caps by editing game .ini files to test this?
  • alangeering
    @ technogiant
    This is exactly why some people have had success in using V-sync to hide the effect.

    In the case of this benchmark set-up: if you could cap the frame rate so that each of the two cards had around 18 ms per frame, then you should see the same smoothness as was achieved with three cards.

    Unfortunately, this means you get a 'smooth' frame rate of 111 fps, which doesn't win any benchmarks. (It's a lot less than 135 fps.)

    If we were to re-evaluate CrossFire/SLI setups based on a frame rate with acceptable micro-stuttering, a 2-way setup would not look nearly as good vs. a single card.
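    The 111 fps figure can be checked with a one-line calculation (an editorial sketch, not part of the original comment): with n cards in alternate-frame rendering and a per-card budget of t milliseconds, the combined output rate is n * 1000 / t frames per second.

```python
# Combined frame rate when each card is given a fixed per-frame budget.
# Hypothetical helper for checking the comment's arithmetic.
def capped_fps(cards, budget_ms):
    """Combined fps for `cards` GPUs each allowed `budget_ms` ms per frame."""
    return cards * 1000.0 / budget_ms

print(f"{capped_fps(2, 18):.0f} fps")  # two cards, 18 ms per card
```

Two cards at 18 ms each gives roughly 111 fps, matching the figure above.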
  • chronicbint
    Well this has just put me off going for my first crossfire setup.....
  • picture_perfect
    I appreciate you guys bringing attention to this problem, since I can't stand judder. I'd like to know how V-sync worked.

    Comment: since micro-stuttering is most obvious at low fps, dual-GPU systems still seem to have communication problems. I avoided SLI/CF in my last build a couple of years ago because of this (and the cost/efficiency thing). Looks like I'll be doing the same thing again this year. Thanks for the heads-up.
  • Anonymous
    Just one question: why no review for 3-way/4-way SLI setups?
  • Gublo
    Thanks for the great article! I was on the 580/590 track for my new build, but this article got me very interested in the 3x 6870. How would they handle being put in sandwich CrossFire? Would it be too hot? And wouldn't that 1 GB of VRAM be a problem with future (or Battlefield 3) AA and post-FX requirements?
  • markdj
    Anonymous said:
    Just an idea: maybe stuttering can be described as standard deviation of the mean frames per second?

    In this way you can have some quantitative measure of the stuttering...

    I don't think the standard deviation of the usual frames-per-second measurements would give a good representation, since the stuttering is the result of variation at a much finer granularity than one second.

    If you took the standard deviation of the Metro 2033 pie chart (the time in milliseconds for each frame rendered over a one-second period), I believe that would give a much better representation of micro-stutter, since it looks at a much smaller increment of time in much more detail.

    Using this std dev along with subjective viewing of the games, I think you could come up with a way to judge it numerically. Have multiple people view various game footage and see where micro-stutter becomes noticeable; then that standard deviation could give you a number for judging future benches.

    This of course would leave some subjectivity in finding that number. Does anyone have an idea for removing that subjectivity?
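    markdj's idea can be sketched directly (an editorial illustration with invented frame times, not real benchmark data): take the per-frame render times within one second and use their standard deviation as a micro-stutter score. Two traces with the same average but different pacing score very differently:

```python
# Standard deviation of per-frame render times as a micro-stutter score.
# Both traces below are invented: same ~30 fps average, different pacing.
import statistics

smooth = [33.3] * 30           # perfectly even pacing
stutter = [13.3, 53.3] * 15    # alternating fast/slow frames

for name, times in [("smooth", smooth), ("stutter", stutter)]:
    print(f"{name}: mean {statistics.mean(times):.1f} ms, "
          f"stdev {statistics.pstdev(times):.1f} ms")
```

Both traces have the same mean frame time, but the stuttering trace has a standard deviation of 20 ms versus essentially zero for the smooth one, which is exactly the kind of number that could be compared against the subjective noticeability threshold markdj proposes.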