Why do higher resolutions show less variance in average game FPS between different CPUs?

I noticed that at 1080p there can be a fair bit of difference between the average frame rates of certain games depending on the CPU being used. However, at higher resolutions the CPU matters less, and the differences in average frame rate get much smaller, sometimes disappearing entirely. If at 1080p and above the GPU is reaching 100% usage, what's going on with the CPU that makes it more influential on frame rate at lower resolutions than at higher ones? I'm just generally curious.

An example would be an i5 vs. an i7. At 1080p the i7 may get 15-20 more fps in certain games, at 1440p only 5-8 more, and at 4K as little as 1-3 more. Yet at all three resolutions the GPU is reaching 100% utilization.
2 answers
  1. Higher resolution strains the GPU more, so the GPU becomes the limiting factor and frame rates drop; resolution changes don't depend on the CPU as much (although the CPU still matters somewhat).
  2. Best answer
    At low resolutions, especially in multiplayer games, the CPU is usually the bottleneck, at least with upper-tier cards. Bump up to 4K and every card made today will struggle; even a pair of cards in SLI / CFX has trouble keeping up in many games.
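The shrinking gap can be sketched with a toy bottleneck model: each frame requires some CPU work (mostly independent of resolution) and some GPU work (which grows with pixel count), and the slower of the two sets the frame time. All the millisecond numbers below are made up for illustration, not benchmarks of any real i5 or i7.

```python
def fps(cpu_ms, gpu_ms):
    """Frames per second when CPU and GPU work largely in parallel:
    the slower component dictates the frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame costs in milliseconds.
cpu_times = {"i5": 8.0, "i7": 6.5}                     # roughly resolution-independent
gpu_times = {"1080p": 7.0, "1440p": 11.0, "4K": 22.0}  # grows with pixel count

for res, gpu_ms in gpu_times.items():
    i5 = fps(cpu_times["i5"], gpu_ms)
    i7 = fps(cpu_times["i7"], gpu_ms)
    print(f"{res}: i5 {i5:.0f} fps, i7 {i7:.0f} fps, gap {i7 - i5:.0f} fps")
```

At 1080p the GPU finishes its 7 ms of work before the slower CPU does, so the faster CPU shows up directly in the frame rate. At 1440p and 4K the GPU takes longer than either CPU, so both CPUs produce the same fps and the gap collapses toward zero.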