
G-Sync Technology Preview: Quite Literally A Game Changer

By Chris Angelini

You've forever faced this dilemma: disable V-sync and live with image tearing, or turn V-sync on and tolerate the annoying stutter and lag? Nvidia promises to make that question obsolete with a variable refresh rate technology we're previewing today.

A Brief History of Fixed Refresh Rates

A long time ago, PC monitors were big heavy items that contained curiously-named components like cathode ray tubes and electron guns. Back then, the electron guns shot at the screen to illuminate the colorful dots we call pixels. They did this one pixel at a time in a left-to-right scanning pattern for each line, working from the top to the bottom of the screen. Varying the electron guns' speed from one complete refresh to the next wasn't very practical, and there was no real need since 3D games were still decades away. So, CRTs and the associated analog video standards were designed with fixed refresh rates in mind.

LCDs eventually replaced CRTs, and digital connections (DVI, HDMI, and DisplayPort) replaced the analog ones (VGA). But the bodies responsible for setting video signal standards (chief among them VESA) haven't shifted away from those fixed refresh rates. Movies and television, after all, still rely on an input signal with a constant frame rate. Again, the need for a variable refresh didn't seem so important.

Variable Frame Rates and Fixed Refresh Rates Don’t Match

Until the advent of advanced 3D graphics, a fixed refresh rate for displays was never an issue. However, an issue surfaced once we started getting our hands on powerful graphics processors: the rate at which a GPU renders individual frames (its frame rate, commonly expressed in FPS, or frames per second) isn't constant. Rather, it varies over time. A given card might generate 30 frames per second in a particularly taxing scene and then 60 FPS moments later when you look up into an empty sky.

V-sync off makes you vulnerable to severe tearing on-screen

As it turns out, variable frame rates from a graphics card and a fixed refresh rate on an LCD don't work particularly well together. In that configuration, you end up with an on-screen artifact called tearing, which appears when parts of two or more frames are drawn to the screen during a single refresh cycle. The partial frames are typically misaligned, yielding a distracting effect that tracks on-screen motion.

The image above shows two well-known artifacts that are common but often difficult to document. Because they are display artifacts, they don't show up in regular in-game screenshots; they only affect the image you actually see. Capturing them accurately requires a fast camera, or a capture card (which is what we use for our FCAT-based benchmarking) recording an uncompressed video stream from the DVI port, where the transition from one frame to another is clearly visible. At the end of the day, though, the best way to see these effects is with your own eyes.

You can see the tearing effect both in the images above, taken with a camera, and in the one below, captured through a capture card. The picture is cut horizontally, with the two halves misaligned. In the first shot, a 60 Hz Sharp screen sits on the left and a 120 Hz Asus display on the right. Tearing at 120 Hz is naturally less pronounced since the refresh rate is twice as high, though it is still noticeable. This type of visual artifact is the clearest indicator that the pictures were taken with V-sync disabled.

Battlefield 4 on a GeForce GTX 770 with V-sync disabled

The other issue we see in the BioShock: Infinite comparison shot is called ghosting, and it's particularly apparent at the bottom of the left side. This is due to screen latency. To make a long story short, individual pixels don't change color quickly enough, leading to this type of afterglow. The in-game effect is far more dramatic than a still image can convey. A panel with 8 ms gray-to-gray response time, such as the Sharp, will appear blurry whenever there's fast movement on-screen. That's why those displays typically aren't recommended for first-person shooters.

V-sync: Trading One Problem For Another

Vertical synchronization, or V-sync, is a very old solution to the tearing problem. Enabling it essentially tells the video card to match the screen's refresh, eliminating tearing entirely. The downside is that, if your video card cannot keep up and the frame rate dips below 60 FPS (on a 60 Hz display), effective FPS bounces among integer fractions of the screen's refresh rate (60, 30, 20, 15 FPS, and so on), which in turn causes perceived stuttering.
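That integer-fraction behavior can be sketched numerically. Here is a minimal Python model (our own illustration, not driver code), assuming a 60 Hz panel and a GPU that holds each finished frame until the next vertical blank:

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms per scan-out

def vsync_effective_fps(render_fps: float) -> float:
    """With V-sync on, a finished frame is held until the next refresh,
    so its on-screen time rounds up to a whole number of refresh intervals."""
    frame_time = 1.0 / render_fps
    # Small epsilon guards exact multiples against floating-point round-up
    intervals = math.ceil(frame_time / REFRESH_INTERVAL - 1e-9)
    return REFRESH_HZ / intervals

for fps in (60, 50, 35, 25):
    print(f"GPU renders at {fps} FPS -> display shows {vsync_effective_fps(fps):.0f} FPS")
```

Note how a card averaging 50 FPS is punished all the way down to 30 FPS: there is no middle ground between the refresh rate's divisors.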

When frame rate drops below refresh, you encounter stuttering with V-sync on

Furthermore, because it forces the video card to wait, and sometimes relies on a third back buffer, V-sync can add input lag to the chain. So V-sync is both a blessing and a curse, trading one problem for another. An informal survey around the office suggests that most gamers keep V-sync off as a general rule, turning it on only when tearing becomes unbearable.

Getting Creative: Nvidia Introduces G-Sync

With the launch of its GeForce GTX 680, Nvidia enabled a driver mode called Adaptive V-sync, which attempted to mitigate the issues with V-sync by turning it on at frame rates above the monitor's refresh rate, and then quickly switching it off if instantaneous performance dropped below the refresh rate. Although the technology did its job well, it was really a workaround, and it did not prevent tearing when the frame rate dropped below the display's refresh.
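The policy itself fits in a few lines. This is our own sketch of the toggle as described, not Nvidia's driver logic:

```python
def adaptive_vsync_enabled(instantaneous_fps: float, refresh_hz: float = 60) -> bool:
    """Adaptive V-sync as described: keep V-sync on while the GPU can match
    the refresh (avoiding tearing), and turn it off below the refresh rate
    (accepting tearing rather than the 60 -> 30 FPS stutter drop)."""
    return instantaneous_fps >= refresh_hz

print(adaptive_vsync_enabled(75))  # GPU keeps up: V-sync stays on
print(adaptive_vsync_enabled(45))  # GPU falls behind: V-sync turns off
```

The sketch also makes the limitation obvious: whenever the function returns False, tearing is back.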

The introduction of G-Sync is much more interesting. Nvidia is basically showing that, instead of forcing video cards to display games on monitors with a fixed refresh, we can make the latest screens work at variable rates.

The GPU's frame rate determines the monitor's refresh, eliminating the artifacts of V-sync on or off

DisplayPort’s packet-based data transfer mechanism provided a window of opportunity. By using variable blanking intervals in the DisplayPort video signal, and replacing the monitor's scaler with a module that understands variable blanking, an LCD can be driven at a refresh rate aligned to whatever frame rate the video card is putting out (up to the screen's maximum refresh, of course). In practice, Nvidia is creatively leveraging capabilities already enabled by DisplayPort, killing two birds (tearing and stutter) with one stone.
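As a rough illustration of why this matters, compare when finished frames reach the screen under each scheme. This is our own simplified timing model (not Nvidia's module): fixed refresh with V-sync delays each frame to the next 60 Hz scan-out boundary, while a variable refresh scans out as soon as a frame is ready, limited only by the panel's 144 Hz maximum:

```python
import math

def fixed_refresh_display_times(frame_done_ms, refresh_hz=60):
    """Fixed refresh + V-sync: each frame waits for the next scan-out boundary."""
    period = 1000.0 / refresh_hz
    return [math.ceil(t / period) * period for t in frame_done_ms]

def variable_refresh_display_times(frame_done_ms, max_panel_hz=144):
    """Variable refresh: scan-out starts as soon as a frame is ready,
    subject only to the panel's minimum interval between refreshes."""
    min_interval = 1000.0 / max_panel_hz
    times, last = [], -min_interval
    for t in frame_done_ms:
        last = max(t, last + min_interval)
        times.append(last)
    return times

frames = [10.0, 30.0, 55.0]  # frame completion times in ms
print(fixed_refresh_display_times(frames))     # each frame delayed to a 16.7 ms boundary
print(variable_refresh_display_times(frames))  # frames shown essentially on completion
```

In the fixed case every frame picks up several milliseconds of waiting (and the irregular waits are what you perceive as stutter); in the variable case the display simply follows the GPU.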

Even before we jump into the hands-on testing, we have to commend the creative approach to solving a very real problem affecting gaming on the PC. This is innovation at its finest. But how well does G-Sync work in practice?

Nvidia sent over an engineering sample of Asus' VG248QE with its scaler replaced by a G-Sync module. We're already plenty familiar with this specific display; we reviewed it in Asus VG248QE: A 24-Inch, 144 Hz Gaming Monitor Under $300 and it earned a prestigious Tom's Hardware Smart Buy award. Now it's time to preview how Nvidia's newest technology affects our favorite games.

  • ubercake, 12 December 2013 15:14
    I'm on page 4, and I can't even contain myself.

    Tearing and input lag at 60Hz on a 2560x1440 or 2560x1600 has been the only reason I won't game on one. G-sync will get me there.

    This is awesome outside of the box tech.
  • ubercake, 12 December 2013 15:19
    I do think Nvidia is making a huge mistake by keeping this to themselves though. This should be a technology implemented with every panel sold and become part of an industry standard for HDTVs, monitors or other viewing solutions!
  • mauller07, 12 December 2013 20:20
    This is why you use Triple buffering with v-sync, the majority of these problems do not occur and its not as big a problem as nvidia make it seem.
    They supposedly had a vsync and triple buffered demo against the g-sync monitor and it was tearing all over the place, i have never had any tearing or the problems they showed when using v-sync and triple buffering.
    of course i still get the odd bit of stutter on occasion but its nothing like nvidia made it out to be, i am sure they used the worse case possible to demonstrate g-sync.

    g-sync would be more valued if nvidia made it an open technology although i cant see how they could possibly stop others doing this since its just changing the way the scaler refreshes on a monitor using the timing signal from the displayport packet to initiate a screen refresh, i can see it being hacked very quickly to work on any card, there is nothing in hardware that is required on the gpu end except a displayport connection.
  • gofasterstripes, 13 December 2013 08:42
    So, uh, this article kinda reads like the g-sync modules will be available to add-in to an existing display? Or not?
  • gofasterstripes, 13 December 2013 08:43
    So, mauller - whats the daughtercard in the pic for then? Is it a red herring?
  • Mousemonkey, 13 December 2013 09:05
    Quote:
    So, uh, this article kinda reads like the g-sync modules will be available to add-in to an existing display? Or not?


    Only one or maybe a couple but not all.
  • Algernon Ex, 13 December 2013 11:14
    fail.
  • mauller07, 13 December 2013 13:41
    Quote:
    So, mauller - whats the daughtercard in the pic for then? Is it a red herring?


    the card in the pic is what replaces the scaler unit inside a compatible monitor, nothing is different on the pc side.
  • mi1ez, 17 December 2013 04:33
    Quote:
    And so we have to rely on carefully-written and eloquently delivered words.


    This line had me in pieces!