- To Synchronize Or Not To Synchronize, That Is (No Longer) The Question
- 3D LightBoost, On-Board Memory, Standards, And 4K
- 60 Hz Panels, SLI, Surround, And Availability
- Getting G-Sync Working, And Our Test Setup
- Testing G-Sync Against V-Sync Enabled
- Testing G-Sync Against V-Sync Disabled
- Game Compatibility: Mostly Great
- Is G-Sync The Game-Changer You Didn’t Know You Were Waiting For?
You've forever faced this dilemma: disable V-sync and live with image tearing, or turn V-sync on and tolerate the annoying stutter and lag? Nvidia promises to make that question obsolete with a variable refresh rate technology we're previewing today.
A Brief History of Fixed Refresh Rates
A long time ago, PC monitors were big, heavy devices built around curiously named components like cathode ray tubes and electron guns. The electron guns illuminated the colorful dots we call pixels one at a time, scanning left to right along each line and working from the top of the screen to the bottom. Varying the guns' speed from one complete refresh to the next wasn't practical, and there was no real need to, since 3D games were still decades away. So CRTs, and the analog video standards built around them, were designed with fixed refresh rates in mind.
LCDs eventually replaced CRTs, and digital connections (DVI, HDMI, and DisplayPort) replaced analog VGA. But the standards bodies responsible for video signaling (with VESA at the top) never moved away from fixed refresh rates. Movies and television, after all, still deliver content at a constant frame rate, so the need for variable refresh didn't seem pressing.
Variable Frame Rates and Fixed Refresh Rates Don’t Match
Until the advent of advanced 3D graphics, a fixed display refresh rate was never an issue. One surfaced, however, once powerful graphics processors arrived: the rate at which a GPU renders individual frames (its frame rate, commonly expressed in frames per second, or FPS) isn't constant. Rather, it varies over time. A given card might manage 30 FPS in a particularly taxing scene, then 60 FPS moments later when you look up into an empty sky.
As it turns out, variable frame rates from a graphics card and a fixed refresh rate on an LCD don't work particularly well together. In that configuration, you end up with an on-screen artifact called tearing, which happens when two or more partial frames are drawn to the screen during a single refresh cycle. Because the partial frames are misaligned, the seam between them is very distracting whenever there's motion.
The image above shows two well-known artifacts that are commonly seen but difficult to document. Because they are display artifacts, they don't show up in regular in-game screenshots; they exist only in the image you actually experience. Capturing them accurately requires a fast camera. Alternatively, with a capture card (which is what we use for our FCAT-based benchmarking), you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to another. At the end of the day, though, the best way to appreciate these effects is with your own eyes.
You can see the tearing effect in both images above, taken with a camera, and the one below, captured through a card. The picture is cut horizontally and appears misaligned. In the first shot, we have a 60 Hz Sharp screen on the left and a 120 Hz Asus display on the right. Tearing at 120 Hz is less pronounced, since the refresh rate is twice as high, but it's still clearly visible. This type of artifact is the surest sign that the pictures were taken with V-sync disabled.
The other issue in the BioShock: Infinite comparison shot is called ghosting, and it's particularly apparent toward the bottom of the left side. It comes down to pixel response time: individual pixels don't change color quickly enough, leaving an afterglow behind moving objects. The in-game effect is far more dramatic than a still image can convey. A panel with an 8 ms gray-to-gray response time, such as the Sharp's, appears blurry whenever there's fast on-screen movement, which is why such displays typically aren't recommended for first-person shooters.
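The tearing mechanic described above can be sketched in a few lines of Python (all numbers illustrative): with V-sync off, a buffer flip that lands partway through a scanout splits the displayed image at whichever row the panel is drawing at that instant.

```python
def tear_row(flip_time_ms: float, scanout_ms: float, rows: int = 1080) -> int:
    """Row where a tear appears when V-sync is off.

    The panel scans rows top to bottom over one refresh interval
    (scanout_ms). A back-buffer flip partway through that interval
    splits the image at the row being scanned out, so the top of the
    screen shows the old frame and the bottom shows the new one.
    """
    progress = (flip_time_ms % scanout_ms) / scanout_ms
    return int(progress * rows)

# On a ~60 Hz panel (here idealized to a 16 ms scanout), a flip 8 ms
# into a refresh cycle tears roughly halfway down a 1080-row screen.
print(tear_row(flip_time_ms=8.0, scanout_ms=16.0))  # 540
```

A faster refresh shortens each scanout interval, so any given tear is on screen for less time, which is why tearing looks milder on the 120 Hz panel.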
V-sync: Trading One Problem For Another
Vertical synchronization, or V-sync, is a very old solution to the tearing problem. Enabling it tells the video card to match the screen's refresh rate, eliminating tearing entirely. The downside is that when your card can't keep up and the frame rate dips below 60 FPS (on a 60 Hz display), the effective frame rate bounces among integer divisors of the screen's refresh rate (so, 60, 30, 20, 15 FPS, and so on), which is perceived as stuttering.
Furthermore, because it forces the video card to wait, and sometimes relies on a third back buffer, V-sync can add input lag to the chain. V-sync is thus both a blessing and a curse, trading one set of compromises for another. An informal survey around the office suggests that most gamers keep V-sync off as a general rule, turning it on only when tearing becomes unbearable.
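The quantization described above follows from double buffering: a finished frame must wait for the next refresh boundary, so any frame that takes longer than one refresh interval costs a whole extra interval. A minimal sketch, assuming simple double-buffered V-sync:

```python
import math

def effective_fps(refresh_hz: float, render_fps: float) -> float:
    """Effective display rate under double-buffered V-sync.

    A frame that takes longer than one refresh interval is held until
    the next refresh, so the display shows one frame every
    ceil(refresh / render) intervals -- an integer divisor of the
    refresh rate.
    """
    if render_fps >= refresh_hz:
        return refresh_hz  # V-sync caps output at the panel's refresh
    return refresh_hz / math.ceil(refresh_hz / render_fps)

# On a 60 Hz panel, a GPU averaging 45 FPS is forced down to 30 FPS:
for fps in (70, 45, 25, 16):
    print(fps, "->", effective_fps(60, fps))  # 60, 30, 20, 15
```

Note how a card rendering at 45 FPS gets no credit for the extra performance over 30 FPS; every missed refresh rounds down to the next divisor.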
Getting Creative: Nvidia Introduces G-Sync
With the launch of its GeForce GTX 680, Nvidia introduced a driver mode called Adaptive V-sync, which attempted to mitigate these issues by enabling V-sync when the frame rate was above the monitor's refresh rate and quickly disabling it when instantaneous performance dropped below that point. Although the technology did its job well, it was really more of a workaround; it did not prevent tearing when the frame rate fell below the display's refresh rate.
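As described, Adaptive V-sync is a per-frame toggle rather than a change to the display itself. A minimal sketch of that policy (our reading of Nvidia's description, not actual driver code):

```python
def adaptive_vsync(frame_times_ms, refresh_hz=60):
    """Per-frame Adaptive V-sync decision: keep V-sync on while frames
    arrive within one refresh interval (no tearing), and switch it off
    the moment a frame misses that window (avoiding quantized stutter,
    at the cost of tearing while performance stays low).
    """
    interval_ms = 1000.0 / refresh_hz  # ~16.7 ms at 60 Hz
    return ["on" if t <= interval_ms else "off" for t in frame_times_ms]

# Render times for five frames; the two slow frames drop V-sync.
print(adaptive_vsync([14.0, 16.0, 20.0, 33.3, 15.0]))
# ['on', 'on', 'off', 'off', 'on']
```

The sketch makes the limitation obvious: during every "off" stretch the display is back to plain tearing, which is exactly the case G-Sync targets.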
The introduction of G-Sync is much more interesting. Nvidia is basically showing that, instead of forcing video cards to display games on monitors with a fixed refresh, we can make the latest screens work at variable rates.
DisplayPort’s packet-based data transfer mechanism provided a window of opportunity. By using variable blanking intervals in the DisplayPort video signal, and by replacing the monitor's scaler with a module that understands a variable blanking signal, an LCD can be driven at a refresh rate aligned to whatever frame rate the video card is producing (up to the screen's maximum refresh rate, of course). In practice, Nvidia is creatively leveraging capabilities already enabled by DisplayPort, killing two birds (tearing, and V-sync's stutter and lag) with one stone.
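In other words, instead of the GPU chasing the panel's clock, the panel's blanking interval is stretched to match the GPU. A hypothetical sketch of the timing (the 30-144 Hz limits here are assumptions for illustration; the VG248QE's panel tops out at 144 Hz):

```python
def refresh_interval_ms(frame_time_ms: float,
                        min_hz: float = 30.0,    # assumed panel floor
                        max_hz: float = 144.0) -> float:
    """Stretch the blanking interval so a refresh begins when the GPU
    delivers a frame, clamped to the panel's supported range: below
    min_hz the panel must refresh anyway, and above max_hz frames
    have to wait for the fastest refresh the panel supports.
    """
    shortest = 1000.0 / max_hz  # ~6.9 ms at 144 Hz
    longest = 1000.0 / min_hz   # ~33.3 ms at 30 Hz
    return min(max(frame_time_ms, shortest), longest)

# A 10 ms frame (100 FPS) gets a 10 ms refresh; a 40 ms frame (25 FPS)
# is clamped to the panel's slowest refresh.
print(refresh_interval_ms(10.0))            # 10.0
print(round(refresh_interval_ms(40.0), 1))  # 33.3
```

Within the panel's range, every frame is shown the moment it's ready, so neither tearing nor divisor-quantized stuttering can occur.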
Even before we jump into the hands-on testing, we have to commend the creative approach to solving a very real problem affecting gaming on the PC. This is innovation at its finest. But how well does G-Sync work in practice?
Nvidia sent over an engineering sample of Asus' VG248QE with its scaler replaced by a G-Sync module. We're already plenty familiar with this specific display; we reviewed it in Asus VG248QE: A 24-Inch, 144 Hz Gaming Monitor Under $300 and it earned a prestigious Tom's Hardware Smart Buy award. Now it's time to preview how Nvidia's newest technology affects our favorite games.