
FX Vs. Core i7: Exploring CPU Bottlenecks And AMD CrossFire

AMD and Intel continue serving up increasingly faster CPUs. But graphics card performance is accelerating even faster. Is there still such a thing as processor-bound gaming? We take two Radeon HD 7970s, high-end desktop CPUs, and a few games to find out.

We've seen processor performance double every three to four years. And yet, some of the most demanding game engines we've tested are as old as the Core 2 Duo that still resides in my office PC. Surely, CPU bottlenecks would be a thing of the past, right? Well, as it turns out, GPU performance speeds ahead at an even faster rate than that of host processors. And so, the debate over whether to buy a faster CPU or even more graphics muscle rages on.

There comes a point where continuing that battle is moot, though. For us, it arrived when our games ran smoothly at our largest monitor's 2560x1600 native resolution. It simply didn't matter whether a faster component took us from an average of 120 to 200 frames per second.

In response to that stagnation, with component speeds racing ahead of display resolutions, AMD introduced its Eyefinity technology, and Nvidia answered with Surround. Both expand gaming beyond a single display, making 5760x1080 a very playable resolution on high-end GPUs. In fact, a trio of 1920x1080 displays is both less expensive and more engrossing than a single 2560x1600 screen, giving us the perfect excuse to splurge on some extra pixel-pushing power.

But does a display surface stretching 5760x1080 require any additional processing muscle in order to prevent bottlenecks? Ah, suddenly that becomes an interesting question again.
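The arithmetic behind that question is easy to check. Here's a minimal sketch (in Python, purely illustrative; the resolutions are the ones discussed above) comparing the pixel counts the graphics subsystem has to fill each frame:

```python
# Pixel-count comparison: three 1920x1080 panels (Eyefinity/Surround)
# versus a single 2560x1600 monitor.
single = 2560 * 1600        # 4,096,000 pixels
triple = 3 * 1920 * 1080    # 6,220,800 pixels

print(f"2560x1600:  {single:,} pixels per frame")
print(f"5760x1080:  {triple:,} pixels per frame")
print(f"Extra load: {triple / single - 1:.0%}")  # ~52% more pixels
```

Roughly half again as many pixels per frame lands on the graphics cards, which is exactly why the location of the bottleneck is worth re-testing.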

Up until now, when we've used AMD's GPUs, we've typically paired them with its competition's processors. Is such a move backed by hard data? Previously, based on plenty of benchmark results, we would have said so. However, the company has a new architecture available, so we bought a boxed FX-8350 to challenge prior convention. After all, there was a lot to like in AMD FX-8350 Review: Does Piledriver Fix Bulldozer's Flaws?

Entering this contest at a substantial price disadvantage, Intel’s Core i7-3770K needs to prove that it's not only faster than the AMD chip in games, but fast enough to overcome its premium in our value analysis.

Although both of the motherboards we're using come from Asus' Sabertooth family, the company charges more for its LGA 1155-equipped model, further complicating the value story for Intel. We picked these platforms specifically to keep the comparison as fair as possible from a performance standpoint, without pricing getting in the way.
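For context, the value analysis we keep referring to boils down to performance per unit of currency. Here's a minimal sketch of that metric; the prices and frame rates below are hypothetical placeholders, not our measured results:

```python
# Hypothetical value comparison: average fps divided by platform cost.
# All numbers are made-up placeholders for illustration only.
platforms = {
    "FX-8350 + Sabertooth 990FX": {"price": 350.0, "avg_fps": 95.0},
    "i7-3770K + Sabertooth Z77":  {"price": 500.0, "avg_fps": 110.0},
}

for name, p in platforms.items():
    print(f"{name}: {p['avg_fps'] / p['price']:.3f} fps per currency unit")
```

By this yardstick, the Intel chip only wins the value contest if its performance lead is at least as large as its price premium.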

This thread is closed for comments
  • redh4t, 24 January 2013 17:44
    So the i5-3570K should be in first place for gaming value compared to the i7-3770K and AMD's FX-8350. Right?
  • Steveymoo, 24 January 2013 18:08
    Spikes and microstutter are really the definitive difference between manufacturers now. Most people only own a 60Hz flat panel monitor, so if you buy mid- to top-end hardware, you will hardly ever see frame rates drop below 60.

    What you're paying a premium for (it seems) is a fluid experience. If you buy an Nvidia SLI/Intel combo, your wallet will take a hit, but you're in for less stutter. Go for an all-AMD CrossFire/CPU setup, and you might just notice the occasional stutter - not a problem for casual gamers, but a bit of a nuisance when you're twitch gaming. My aging hardware often causes problems in BF3 - flipping round a corner, spinning round, etc. - God help you if your GPU freezes for just a split second.
  • swamprat, 24 January 2013 19:58
    Quote:
    Our benchmark results have long shown that ATI's graphics architectures are more dependent on a strong processor than Nvidia's.

    Is there a cunning (or daft) reason that Nvidia cards wouldn't work with AMD processors? If not, then wouldn't it make sense to test that too - perhaps go down the route of seeing what the slowest CPU from each side is that gets up to a certain level (either absolute rates or a percentage of the top Intel result, or somesuch)?
  • mactronix, 24 January 2013 22:39
    Quote:
    swamprat: Is there a cunning (or daft) reason that Nvidia cards wouldn't work with AMD processors? If not, then wouldn't it make sense to test that too - perhaps go down the route of seeing what the slowest CPU from each side is that gets up to a certain level (either absolute rates or a percentage of the top Intel result, or somesuch)?


    I totally agree with this. Nvidia should have been tested as well.

  • Daedalus12, 24 January 2013 22:55
    Quote:
    redh4t: So the i5-3570K should be in first place for gaming value compared to the i7-3770K and AMD's FX-8350. Right?


    That's what it looks like to me.
  • doveman, 25 January 2013 00:04
    All very interesting, but try testing with Arma2OA and DCS World/Black Shark 2. With my 6950 2GB and Phenom II X4 955 overclocked to 3.8GHz, I often get 20fps or less in A2 and 25-35fps in BS2 on a single 1920x1200 @ 60Hz display.

    A2 can go up to about 60fps depending on the mission, but it seems the AI and some other systems are very processor-heavy (and apparently it doesn't run as well on AMD CPUs anyway), which results in the GPU only being used 30-40%. Some of the worst missions seem to be the official campaigns, which apparently use a lot of scripting and drag it down to 17fps at times.

    BS2 never goes above about 40fps and drops to about 25fps or lower whenever there are several other plane/helo models in view, such as when flying towards an airfield, and again it only uses about 40% of the GPU most of the time. For a 64-bit game, it was also disappointing to find it only uses about 2GB of my 16GB of RAM, so I made an 11GB RAMDisk for it instead to make it load faster and eliminate the stuttering/jitters.
  • jonboy79, 27 January 2013 17:19
    The FX-6300 needed to be in there too.
  • wild9, 2 February 2013 21:31
    [OT] Is there any way to disable the auto video playback on Tom's? It's having quite an impact on my system resources, and as such I'm far less inclined to open multiple reviews. For instance, I have 5 reviews open and my dual-core Athlon 64 rig is pulling 60% resources under Opera. Other sites work fine.

    To be honest, it's rather irritating when you have seen the video more than once and it keeps on opening every time you just want to read a review. I respect the need to raise advertising revenue, especially in these difficult times; I just find the way the video content automatically loads to be somewhat frustrating.

    Thank you.
  • keyholder, 11 February 2013 06:09
    You say "But nobody games at 1920x1080". Don't they...

    I know for one that I do. Yes, I have 2 x 7950s in CrossFire. Do I want more than one monitor? No... Do I want to game at 5760x1080? Err, no I don't.

  • Xerpadon Xerilious, 11 April 2013 09:57
    The setups are wrong: the Sabertooth 990FX is PCIe 2.0 and the Sabertooth Z77 is PCIe 3.0. For an equal comparison they needed the Sabertooth 990FX R2.0 Gen3. The difference between the Sabertooth 990FX (B.1604) and the Sabertooth 990FX R2.0 Gen3 (B.0305) is about 10%~20% (it differs a lot between website reviews, games, and resolutions). Not to be a conspiracy nut, but the reviewers should have known this from the start. Not sure if they're going downhill or if they just made an honest mistake.
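Xerpadon Xerilious' PCIe point just above is easy to put into numbers, even if the in-game impact is workload-dependent. Here's a minimal sketch of the per-direction bandwidth an x16 slot provides under each standard (the link rates and encodings are the published PCIe 2.0/3.0 figures; the helper function is just for illustration):

```python
# Per-direction bandwidth of a PCIe x16 slot, derived from link parameters.
# PCIe 2.0: 5 GT/s with 8b/10b encoding; PCIe 3.0: 8 GT/s with 128b/130b.
def x16_bandwidth_gb_s(transfer_rate_gt_s: float, encoding_efficiency: float) -> float:
    per_lane_gb_s = transfer_rate_gt_s * encoding_efficiency / 8  # Gb/s -> GB/s
    return per_lane_gb_s * 16                                     # 16 lanes

print(f"PCIe 2.0 x16: {x16_bandwidth_gb_s(5.0, 8 / 10):.2f} GB/s")    # 8.00 GB/s
print(f"PCIe 3.0 x16: {x16_bandwidth_gb_s(8.0, 128 / 130):.2f} GB/s") # ~15.75 GB/s
```

That is nearly double the bus bandwidth on paper, although a single GPU of this era rarely saturates PCIe 2.0 x16; multi-card setups can be more sensitive, which helps explain why reported in-game differences vary so widely between reviews.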
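Steveymoo's earlier point about spikes and microstutter also deserves a concrete illustration, because an average frame rate can look healthy while individual frames hitch. Here's a minimal sketch, using invented frame-time numbers, of how per-frame times expose stutter that an fps average hides:

```python
from statistics import mean, median

# Invented sample: mostly ~16.7 ms frames (60 fps) plus two long spikes
# that a player would feel as stutter, even though the average looks fine.
frame_times_ms = [16.7, 16.9, 16.5, 42.0, 16.8, 17.1, 16.6, 38.5, 16.7, 16.9]

typical = median(frame_times_ms)
spikes = [t for t in frame_times_ms if t > 2 * typical]  # over 2x the typical time

print(f"Average frame rate: {1000 / mean(frame_times_ms):.0f} fps")
print(f"Typical (median) frame time: {typical:.2f} ms")
print(f"Stutter spikes: {len(spikes)} frames above {2 * typical:.1f} ms")
```

Here the average works out to roughly 47 fps, yet two frames take well over twice the typical ~17 ms, exactly the kind of hitch a twitch gamer feels.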