
Do I Need To Worry About Input Lag?

The Myths Of Graphics Card Performance: Debunked, Part 1

Myth: Graphics Cards Affect Input Lag

Let’s say you’re getting shot up in your favorite multi-player shooter before you have the chance to even react. Is your opposition really that much better than you? Could they be cheating? Or is something else going on?

Aside from the occasional cheat, which does happen, the truth might be that those seemingly super-human reflexes are at least partly assisted by technology. And they might have very little to do with your graphics card.

It takes time for what happens in a game to show up on your screen. It takes time for you to react. And it takes time for your mouse and keyboard inputs to register. Somewhat improperly, the delay between you issuing a command and the on-screen action is commonly called input lag. So, if you press the trigger in a first-person shooter and your weapon fires 0.1 seconds later, your input lag is effectively 100 milliseconds.
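The definition above can be sketched in a few lines; the timestamps here are hypothetical, standing in for a real measurement:

```python
# Input lag: time between issuing a command and seeing the result on screen.
# Hypothetical timestamps, in seconds.
trigger_pressed_at = 10.000   # the moment the mouse button is clicked
muzzle_flash_at = 10.100      # the moment the shot appears on screen

input_lag_ms = (muzzle_flash_at - trigger_pressed_at) * 1000
print(f"Input lag: {input_lag_ms:.0f} ms")  # → Input lag: 100 ms
```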

Human reaction times to visual input vary. According to a 1986 U.S. Navy study, F-14 fighter pilots reacted to a simple visual stimulus in an average of 223 ms. And although it might not seem intuitive, human beings actually react faster to sound than to visual input; reactions to auditory stimuli tend to be in the ~150 ms range.

If you're curious, you can test for yourself how quickly you react to either by clicking the simple visual test and then the audio test.

Fortunately, no matter how poorly-configured your PC may be, it probably won't hit 200 ms of input lag. So, your personal reaction time remains the biggest influencer of how quickly your character responds in a game.

As differences in input lag increase, however, they increasingly do affect gameplay. Imagine a professional gamer with reflexes comparable to the best fighter pilots at 150 ms. A 50 ms slow-down in input means that person will react roughly 33% slower (that's three frames on a 60 Hz display) than the competition. At the professional level, that's notable.
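The arithmetic behind that claim works out as follows (the 50 ms penalty is the hypothetical figure from the paragraph above):

```python
reaction_ms = 150    # elite reaction time, from the paragraph above
extra_lag_ms = 50    # hypothetical added input lag
refresh_hz = 60

frame_time_ms = 1000 / refresh_hz            # ~16.7 ms per frame at 60 Hz
frames_lost = extra_lag_ms / frame_time_ms   # 3 frames
slowdown = extra_lag_ms / reaction_ms        # ~33% slower to respond

print(f"{frames_lost:.0f} frames lost, {slowdown:.0%} slower")
# → 3 frames lost, 33% slower
```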

For mere mortals (including me; I scored 200 ms in the visual test linked above), and for anyone who would rather play Civilization V leisurely than Counter Strike 1.6 competitively, it’s an entirely different story; you can likely ignore input lag altogether.

Here are some of the factors that can worsen input lag, all else being equal:

  • Playing on an HDTV (even more so if its game mode is disabled) or playing on an LCD display that performs some form of video processing that cannot be bypassed. Check out DisplayLag's Input Lag database for a great list organized by model.
  • Playing on LCD displays that employ higher-response-time IPS panels (5-7 ms GTG typical), versus TN+Film panels (1-2 ms GTG possible), versus CRT displays (the fastest available).
  • Playing on displays with lower refresh rates; the newest gaming displays support 120 or 144 Hz natively.
  • Playing at low frame rates (30 FPS is one frame every 33 ms; 144 FPS is one frame every 7 ms).
  • Using a USB-based mouse with a low polling rate. The default 125 Hz is an 8 ms cycle time, yielding a ~4 ms input lag on average. Meanwhile, gaming mice can poll at up to 1000 Hz, for a ~0.5 ms average input lag.
  • Using a low-quality keyboard (keyboard input lag is 16 ms typically, but can be higher for poor ones).
  • Enabling V-sync, especially when combined with triple buffering (there is a myth that Direct3D does not implement triple buffering; in reality, Direct3D does allow for multiple back buffers, but few games exploit this). Check out Microsoft's write-up, if you're technically inclined.
  • Playing with long render-ahead queues. The default in Direct3D is three frames, or 48 ms at 60 Hz. The figure can be raised as high as 20 for greater “smoothness,” or dropped to one for better responsiveness at the cost of greater frame-time variance and, in some cases, somewhat lower overall FPS. There is no zero setting; setting the value to zero simply resets it to the default of three. Check out Microsoft's write-up, if you're technically inclined.
  • Playing on a high-latency Internet connection. While this goes beyond what would strictly be defined as input lag, it effectively stacks with it.
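As a rough sketch, the factors above can be tallied into an end-to-end lag budget. Every figure here is an illustrative assumption drawn from the typical values in the list, not a measurement of any particular setup:

```python
# Hypothetical end-to-end input-lag budget. All values are illustrative
# averages taken from the typical figures discussed above.
lag_sources_ms = {
    "mouse polling (125 Hz USB, average)": 4.0,   # half of the 8 ms cycle
    "keyboard scan (typical)": 16.0,
    "render-ahead queue (3 frames @ 60 Hz)": 48.0,
    "display processing (low-lag monitor)": 10.0,
    "pixel response (TN panel)": 2.0,
}

total = sum(lag_sources_ms.values())
for source, ms in lag_sources_ms.items():
    print(f"{source:40s} {ms:5.1f} ms")
print(f"{'total':40s} {total:5.1f} ms")
```

Even this pessimistic-but-plausible stack stays well under the ~200 ms human reaction time, which is why most players can safely ignore it.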

Factors that do not make a difference include:

  • Using a PS/2 or USB keyboard (see a dedicated page in our article: Five Mechanical-Switch Keyboards: Only The Best For Your Hands)
  • Using a wireless or wired network connection (just try pinging your router if you don’t believe us; you should see ping times of less than 1 ms). 
  • Enabling SLI or CrossFire. The longer render queues required to enable these technologies are generally compensated by higher frame throughput.

Bottom Line: Input lag only matters in "twitch" games, and really matters only at highly competitive levels.

There is a lot more to input lag than just display technology or a graphics card. Your hardware, hardware settings, display, display settings, and application settings all influence this measurement.

  • Jak_Sparra , 10 February 2014 13:38
    Can't wait for part 2. I have a Sapphire R9 290 Tri-X and am enjoying the smoothest gameplay I've ever experienced at 1920x1200 with ultra settings in games like BF4. BUT, I'm interested in getting a 2560x180p monitor for gaming. The only thing holding me back is that I'm worried how much of a drop in FPS I will see. Hope the next article covers stuff like that. Also, as more and more gamers get more system RAM, I'd love to see an article that covers what happens when you use RAM as a RAMdisc and stick the pagefile on it. Would it act nearly as fast as VRAM?
  • rolli59 , 10 February 2014 13:51
    Great article, waiting on part 2.
  • Jonathan Cave , 10 February 2014 15:11
    Great Article.
  • kyzarvs , 10 February 2014 15:54
    @Jak - this is covered on page 6: "What happens when graphics memory is completely consumed? The short answer is that graphics data starts getting swapped to system memory over the PCI Express bus. Practically, this means performance slows dramatically, particularly when textures are being loaded. You don't want this to happen. It'll make any game unplayable due to massive stuttering."
  • Sunius , 10 February 2014 16:40
    Hey, about memory usage and Windows AERO: did you try benchmarking peak memory usage between various windows versions when in fullscreen mode? Going to fullscreen mode should effectively make Windows use 0 graphics memory as far as it's concerned (hence why it takes a while to switch to and out of fullscreen - it is moving data out and into video memory).
  • Wossnames , 11 February 2014 14:05
    10 dB isn't "twice as loud". It is ten times the sound pressure. 3 dB is about double the sound pressure. However, as we humans do not perceive sound linearly, around 6 dB (actually about four times the sound pressure) is generally perceived as "twice as loud".
  • HEXiT , 12 February 2014 22:19
    nice... pretty much confirms what i was thinking about overclocking, the results offer little in the way of real performance gains. for your next foray into overclocking could you do real world cpu performance, as gamers may well be in for a shock... with very limited returns for a massive overclock, power draw and reduced cpu life. productivity on the other hand can bring real returns... so it would be nice to have this confirmed.
  • cdrkf , 13 February 2014 11:16
    One myth I think you missed and really should highlight with respect to graphics memory is that, as you say, the amount doesn't affect performance, but the TYPE of memory really does. There are a lot of lower end cards popping up equipped with large amounts of DDR3 memory (e.g. an R9 250 with 2gb DDR3), and these are categorically a worse buy than a similarly priced card equipped with less but faster GDDR5 memory...
  • jabel_sk , 23 February 2014 00:42
    Skyrim is not a good game to benchmark VRAM because it's a horrible port. However, the modding community has, some would argue, redesigned the engine's memory allocation system altogether.