AMD and Intel continue serving up increasingly faster CPUs. But graphics card performance is accelerating even faster. Is there still such a thing as processor-bound gaming? We take two Radeon HD 7970s, high-end desktop CPUs, and a few games to find out.
We've seen processor performance double every three to four years. And yet, some of the most demanding game engines we've tested are as old as the Core 2 Duo that still resides in my office PC. Surely, CPU bottlenecks would be a thing of the past, right? Well, as it turns out, GPU performance speeds ahead at an even faster rate than that of host processors. And so, the debate over whether to buy a faster CPU or even more graphics muscle rages on.
There comes a point where continuing that battle stops mattering, though. For us, that happened when our games ran smoothly at our largest monitor's 2560x1600 native resolution. It simply didn't matter if a faster component took us from an average of 120 to 200 frames per second.
In response to the stagnation caused by increasingly fast components running into limited resolutions, AMD introduced its Eyefinity technology, and Nvidia responded with Surround. Both expand beyond a single display, making 5760x1080 a very playable resolution on high-end GPUs. In fact, a trio of 1920x1080 displays is both less expensive and more engrossing than a single 2560x1600 screen, giving us the perfect excuse to splurge on some extra pixel-pushing power.
But does a display surface stretching 5760x1080 require any additional processing muscle in order to prevent bottlenecks? Ah, suddenly that becomes an interesting question again.

Up until now, when we've used AMD's GPUs, we've typically paired them with its competition's processors. Is such a move backed by hard data? Previously, based on plenty of benchmark results, we would have said so. However, the company has a new architecture available, so we bought a boxed FX-8350 to challenge prior convention. After all, there was a lot to like in AMD FX-8350 Review: Does Piledriver Fix Bulldozer's Flaws?
Entering this contest at a heavy economic disadvantage, Intel's Core i7-3770K needs to prove that it's not only faster than the AMD chip in games, but fast enough to overcome its price premium in our value analysis.
Although both of the motherboards we're using come from Asus' Sabertooth family, the company charges more for its LGA 1155-equipped model, further complicating the value story for Intel. We picked these platforms specifically to make the performance comparison as fair as possible, without pricing getting in the way.
- Chasing Bottlenecks To Eyefinity (But Not Beyond)
- Test Settings And Benchmarks
- Results: 3DMark, Aliens Vs. Predator, And Metro 2033
- Metro 2033, Second By Second
- Results: Battlefield 3, F1 2012, And Skyrim
- Battlefield 3, Frame By Frame
- Skyrim, Frame By Frame
- Power And Efficiency
- Can AMD's FX Keep Up With Its Radeon HD 7970?


What you're paying a premium for, it seems, is a fluid experience. If you buy an Nvidia SLI/Intel combo, your wallet will take a hit, but you're in for less stutter. Go for an all-AMD CrossFire/CPU setup, and you might just notice the occasional stutter. Not a problem for casual gamers, but a bit of a nuisance when you're twitch gaming. My aging hardware often causes problems in BF3 when flipping around a corner, spinning around, and so on; God help you if your GPU freezes for just a split second.
Is there a cunning (or daft) reason that Nvidia cards wouldn't work with AMD processors? If not, wouldn't it make sense to test that too? Perhaps go down the route of finding the slowest CPU from each side that gets up to a certain level (either absolute frame rates, or a percentage of the top Intel result, or some such).
I totally agree with this. Nvidia should have been tested as well.
That's what it looks like to me.
A2 can go up to about 60 fps depending on the mission, but it seems the AI and some other systems are very processor-heavy (and apparently don't run as well on AMD CPUs anyway), which results in the GPU only being used 30-40%. Some of the worst missions seem to be the official campaigns, which apparently use a lot of scripting and drag it down to 17 fps at times.
BS2 never goes above about 40 fps, and drops to about 25 fps or lower whenever there are several other plane/helo models in view, such as when flying towards an airfield; again, it only uses about 40% of the GPU most of the time. Being a 64-bit game, it was also disappointing to find it only uses about 2 GB of my 16 GB of RAM, so I made an 11 GB RAM disk for it instead to make it load faster and eliminate the stuttering/jitters.
To be honest, it's rather irritating when you have seen the video more than once and it keeps opening every time you just want to read a review. I respect the need to raise advertising revenue, especially in these difficult times; I just find the way the video content automatically loads to be somewhat frustrating.
Thank you.
I know for one that I do. Yes, I have 2 x 7950s in CrossFire. Do I want more than one monitor? No. Do I want to game at 5760x1080? Er, no, I don't.
And the Sabertooth Z77 is a PCIe 3.0 board.
For an equal comparison, they needed the Sabertooth 990FX R2.0 Gen3.
The difference between the Sabertooth 990FX (BIOS 1604) and the Sabertooth 990FX R2.0 Gen3 (BIOS 0305)
is about 10-20% (it varies a lot between site reviews, games, and resolutions).
Not to be a conspiracy nut, but the reviewers should have known this from the start.
Not sure if they are going downhill with people, or if they made an honest mistake.