Speed is the first dimension that comes to mind in a graphics card evaluation. How much faster is the latest and greatest than whatever came before? The Internet is littered with benchmarking data from thousands of sources trying to answer that question.
So, let's start by exploring speed and the variables to consider if you really want to know how fast a given graphics card is.
Myth: Frame rate is the indicator of graphics performance
Let's start with something that the Tom's Hardware audience probably knows already, but remains a misconception elsewhere. Common wisdom suggests that for a game to be playable, it should run at 30 frames per second or more. Some folks believe lower frame rates are still alright, and others insist that 30 FPS is far too low.
In the debate, however, it's not always reinforced that FPS is just a rate, and there is a host of complexity behind it. Most notably, while the frame rate of a movie is constant, the frame rate of a rendered game varies over time and is consequently expressed as an average. That variation is a byproduct of the horsepower required to process any given scene; as the on-screen content changes, so does the frame rate.
The simple point is that there is more to quality of a gaming experience than the instantaneous (or average) rate at which frames are rendered. The consistency of their delivery is an additional factor. Imagine traveling on a highway at a constant 65 MPH compared to the same trip at an average of 65 MPH, spending a lot more time switching between accelerator and brake. You reach your destination in roughly the same amount of time, but the experience is quite a bit different.
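The highway analogy can be made concrete with a quick sketch. The hypothetical Python helpers below (the function names are mine, not from any benchmarking tool) compare two runs with identical average frame rates but very different frame-time consistency:

```python
from statistics import mean, pstdev

def average_fps(frame_times_ms):
    # Average FPS is 1000 ms divided by the mean frame time.
    return 1000.0 / mean(frame_times_ms)

def consistency(frame_times_ms):
    # Standard deviation of frame times, in milliseconds.
    # Lower numbers mean smoother, more consistent delivery.
    return pstdev(frame_times_ms)

# Two runs, both averaging 20 ms per frame (50 FPS):
steady  = [20, 20, 20, 20, 20, 20]   # constant 65 MPH
erratic = [10, 35, 12, 33, 10, 20]   # gas-and-brake driving

print(average_fps(steady), average_fps(erratic))   # identical averages: 50.0 and 50.0
print(consistency(steady), consistency(erratic))   # very different smoothness
```

Both runs report the same average FPS, but only the frame-time deviation reveals which one actually feels smooth.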
So, let's set the question "How much performance is enough?" aside for a moment. We'll get back to it after touching a few other relevant topics.
Introducing V-sync
Myths: Frame rates over 30 FPS aren't necessary; the human eye can't tell a difference. Values above 60 FPS on a 60 Hz display aren't necessary; the monitor is already refreshing 60 times a second. V-sync should always be enabled. V-sync should always be disabled.
How are rendered frames actually displayed? Because of the way almost all LCD displays work, the image on-screen is updated a fixed number of times per second. Typically, the magic number is 60, though there are also 120 and 144 Hz panels capable of more refreshes per second. When you talk about this mechanism, you're referring to the refresh rate, which of course is measured in Hertz.

Now, the mismatch between the graphics card's variable frame rate and the display's fixed refresh rate can be problematic. When the former happens faster than the latter, you end up with multiple frames displayed in the same scan, resulting in an artifact called screen tearing. In the image below, the colored bars denote unique frames from the graphics card getting thrown up on-screen as they're ready. This can be highly distracting, particularly in a fast-paced shooter.
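The mismatch can be sketched numerically. This illustrative helper (my own, not from any tool) works out which frames land inside one refresh interval when the GPU flips buffers as soon as a frame is ready:

```python
def frames_shown_in_scan(scan_index, refresh_hz, frame_time_ms):
    # With V-sync off, the front buffer flips the moment a frame is
    # ready, even mid-scanout. Return which frames (0-based) appear
    # somewhere on screen during the given refresh interval.
    scan_start = scan_index * 1000.0 / refresh_hz
    scan_end = (scan_index + 1) * 1000.0 / refresh_hz
    visible = []
    n = 0
    while n * frame_time_ms < scan_end:
        # Frame n is on screen from its flip at n * frame_time_ms until
        # the next flip; it shows up in this scan if that window
        # overlaps the scanout interval.
        if (n + 1) * frame_time_ms > scan_start:
            visible.append(n)
        n += 1
    return visible

# 150 FPS on a 60 Hz panel: three different frames share one scanout,
# producing two visible tear lines.
print(frames_shown_in_scan(0, 60, 1000 / 150))
```

Any scanout containing more than one frame gets stitched together from different moments in time, which is exactly the tear line you see.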
The image below shows another artifact commonly seen on-screen, but rarely documented. Because it's a display artifact, it doesn't show up in screen shots, but instead represents the image your eyes actually see. You need a fast camera to capture it. FCAT, which is what Chris Angelini used to create the traffic cone shot in Battlefield 4, does reflect tearing, but not the ghosting effect I'm illustrating.

Screen tearing is evident in both of my BioShock Infinite images. But it's more evident on the 60 Hz Sharp than the 120 Hz Asus panel because the VG236HE runs at a refresh rate that's twice as high. This artifact is the clearest indicator that a game is running with V-sync, or vertical synchronization, disabled.
The other issue in the BioShock image is ghosting, which you can see especially toward the bottom of the left image. This is attributable to screen latency. In short, individual pixels don't change color quickly enough and show this type of afterglow. The in-game effect is far more dramatic than my images suggest. A panel with an 8 ms gray-to-gray response time, which is what the Sharp screen on the left is specified for, looks blurry whenever there's fast movement on-screen.
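The arithmetic behind that blur is simple. A rough sketch, using the response-time and refresh figures mentioned above (the function name is mine):

```python
def transition_fraction(gtg_response_ms, refresh_hz):
    # Fraction of each refresh interval a pixel spends still
    # transitioning between gray levels. Values approaching 1.0 mean
    # pixels barely settle before the next frame during fast motion,
    # which reads as ghosting/afterglow.
    refresh_interval_ms = 1000.0 / refresh_hz
    return gtg_response_ms / refresh_interval_ms

print(transition_fraction(8, 60))    # 8 ms GtG at 60 Hz: ~48% of the interval
print(transition_fraction(8, 120))   # same response at 120 Hz: ~96%
```

Note this is only a first-order illustration; real gray-to-gray response varies with the specific color transition and overdrive settings.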
Back to tearing. The aforementioned V-sync is an old solution to the problem, which synchronizes the rate at which the video card presents frames to the screen's refresh rate. Because multiple frames no longer show up in a single panel refresh, tearing is no longer an issue. However, if you crank up the graphics quality of your favorite title and its frame rate drops below 60 FPS (or whatever your panel's refresh is set to), then your effective frame rate bounces between whole-number divisors of the refresh (60, 30, 20 FPS, and so on), illustrated below. Now, you face another artifact called stuttering.

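That frame-rate ladder can be sketched in a few lines. This hypothetical helper assumes simple double-buffered V-sync, where a finished frame waits for the next refresh boundary:

```python
import math

def vsync_effective_fps(render_time_ms, refresh_hz=60):
    # With double-buffered V-sync, a finished frame waits for the next
    # refresh boundary, so the displayed rate snaps to refresh_hz,
    # refresh_hz/2, refresh_hz/3, and so on.
    refresh_interval_ms = 1000.0 / refresh_hz
    refreshes_waited = math.ceil(render_time_ms / refresh_interval_ms)
    return refresh_hz / max(refreshes_waited, 1)

print(vsync_effective_fps(10))   # fast render: locked to 60 FPS
print(vsync_effective_fps(17))   # just misses a refresh: drops to 30 FPS
print(vsync_effective_fps(35))   # misses two refreshes: 20 FPS
```

Notice how a render time only a millisecond over the 16.7 ms budget halves the displayed frame rate, which is why the drop feels so abrupt.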
One of the Internet's oldest arguments is whether you should turn V-sync on or leave it off. Some folks insist it's one or the other, and some enthusiasts will change the setting based on the game they're playing.
So, V-sync On, Or V-sync Off?
Let's say you're in the majority and own a typical 60 Hz display:
- If you play first-person shooter games competitively, and/or have issues with perceived input lag, and/or if your system cannot sustain at least 60 FPS in a given title, and/or you're benchmarking your graphics card, then you should turn V-sync off.
- If none of the above applies to you and you experience significant screen tearing, then you should turn V-sync on.
- As a general rule, or if you don’t feel strongly either way, just keep V-sync off.
If you own a gaming-oriented 120/144 Hz display (if you have one, there's a good chance you bought it specifically for its higher refresh rate):
- You should consider turning V-sync on only when playing older games where frame rates sustain above 120 FPS and you're experiencing screen tearing.
Note that there are certain cases where the frame rate-halving impact of V-sync doesn't apply, such as applications supporting triple buffering, though those cases aren't common. Also, in some games (like The Elder Scrolls V: Skyrim), V-sync is enabled by default. Forcing it off by modifying certain files can cause issues with the game engine itself. In those cases, you're best off leaving V-sync on.
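The rules of thumb above can be condensed into a small decision helper. This is just a sketch of the guidance in this section, with made-up parameter names:

```python
def recommend_vsync(refresh_hz, sustained_fps, competitive_shooter=False,
                    input_lag_sensitive=False, benchmarking=False,
                    tearing_bothers_you=False):
    # Mirrors the guidance above: default to off, and enable V-sync only
    # when tearing is the dominant problem and nothing argues against it.
    if competitive_shooter or input_lag_sensitive or benchmarking:
        return "off"
    if sustained_fps < refresh_hz:
        return "off"   # avoid the frame-rate-halving penalty
    if tearing_bothers_you:
        return "on"
    return "off"       # general rule: leave it off

print(recommend_vsync(60, 90, tearing_bothers_you=True))   # "on"
print(recommend_vsync(60, 45, tearing_bothers_you=True))   # "off"
```

The exceptions noted above (triple buffering, engines like Skyrim's that expect V-sync on) still override this simple logic.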
G-Sync, FreeSync, and the Future
G-Sync Technology Preview: Quite Literally A Game Changer was a preview of Nvidia's solution to all of this. AMD made a somewhat feeble attempt at responding by showing off its FreeSync technology at CES 2014, though that might only be viable on laptops for now - that said, we applaud AMD's open-source approach to the technology as the right way to go. Both capabilities work around V-sync's compromises by allowing the display to operate at a variable refresh.
It is hard to say where the industry is heading, but as I mentioned in my G-Sync coverage, we're not fans of proprietary standards (and I bet most OEMs agree). I'd like to see Nvidia consider opening up G-Sync to the rest of the community, though we know from experience that the company tends not to do this.
- Performance That Matters: Going Beyond A Graphics Card's Lap Time
- Graphics Card Myth Busting: How We Tested
- To Enable Or Disable V-Sync: That Is The Question
- Do I Need To Worry About Input Lag?
- The Myths Surrounding Graphics Card Memory
- More Graphics Memory Measurements
- Thermal Management In A Modern Graphics Card
- Testing Performance At A Constant 40 dB(A)
- Can Overclocking Hurt Performance At 40 dB(A)?
Comments

Jak_Sparra, 10 February 2014 13:38: Can't wait for part 2. I have a Sapphire R9 290 Tri-X and am enjoying the smoothest gameplay I've ever experienced at 1920x1200 with ultra settings in games like BF4. BUT, I'm interested in getting a 2560x180p monitor for gaming. The only thing holding me back is that I'm worried how much of a drop in FPS I will see. Hope the next article covers stuff like that. Also, as more and more gamers get more system RAM, I'd love to see an article that covers what happens when you use RAM as a RAMdisc and stick the pagefile on it. Would it act nearly as fast as VRAM?

rolli59, 10 February 2014 13:51: Great article, waiting on part 2.

Jonathan Cave, 10 February 2014 15:11: Great article.

kyzarvs, 10 February 2014 15:54: @Jak - this is covered on page 6: "What happens when graphics memory is completely consumed? The short answer is that graphics data starts getting swapped to system memory over the PCI Express bus. Practically, this means performance slows dramatically, particularly when textures are being loaded. You don't want this to happen. It'll make any game unplayable due to massive stuttering."

Sunius, 10 February 2014 16:40: Hey, about memory usage and Windows AERO: did you try benchmarking peak memory usage between various Windows versions when in fullscreen mode? Going to fullscreen mode should effectively make Windows use 0 graphics memory as far as it's concerned (hence why it takes a while to switch to and out of fullscreen - it is moving data out of and into video memory).

Wossnames, 11 February 2014 14:05: 10 dB isn't "twice as loud". It is ten times the sound pressure. 3 dB is about double the sound pressure. However, as we humans do not perceive sound linearly, around 6 dB (actually about four times the sound pressure) is generally perceived as "twice as loud".

HEXiT, 12 February 2014 22:19: Nice... pretty much confirms what I was thinking about overclocking: the results bring little in the way of real performance gains. For your next foray into overclocking, could you do real-world CPU performance? Gamers may well be in for a shock, with very limited returns for a massive overclock, higher power draw, and reduced CPU life. Productivity, on the other hand, can bring real returns, so it would be nice to have this confirmed.

cdrkf, 13 February 2014 11:16: One myth I think you missed and really should highlight with respect to graphics memory: as you say, the amount doesn't affect performance, but the TYPE of memory really does. There are a lot of lower-end cards popping up equipped with large amounts of DDR3 memory (e.g. an R9 250 with 2 GB DDR3), and these are categorically a worse buy than a similarly priced card equipped with less but faster GDDR5 memory...

jabel_sk, 23 February 2014 00:42: Skyrim is not a good game to benchmark VRAM because it's a horrible port. However, the modding community has, some would argue, redesigned the engine's memory allocation system altogether.