
Can AMD's FX Keep Up With Its Radeon HD 7970?

FX Vs. Core i7: Exploring CPU Bottlenecks And AMD CrossFire

When we talk about affordable hardware that performs well, we like to use phrases like "80% the performance for 60% the price." Those are always very honest numbers, since we make it a habit to measure performance, power, and efficiency. But they only capture the value of a single component, and components cannot operate on their own. 

After adding up the parts used in today's benchmark analysis, the Intel-based system crested $1,900, while the AMD platform ran us $1,724, both without cases, peripherals, or operating systems. If we wanted to call both setups "complete" solutions, we could add an $80 chassis to give us $1,984 and $1,804 machines, respectively. Since we're adding cost to both boxes, AMD's overall $180 cost savings becomes a smaller percentage of the total price tag. In other words, the other pieces that go into a nice high-end PC serve to diminish AMD's value leadership.

That leaves us with two completely biased ways to compare price to performance. We can only hope that pointing this out upfront keeps us transparent as we present the numbers.

An AMD bias would only include the price of each CPU and motherboard, maximizing the percentage you save; an Intel bias would count everything in the box, shrinking that percentage.

A third alternative would allow us to talk about the motherboards and CPUs as upgrades, assuming you already have cases, power supplies, memory, and storage lying around. Of course, you probably don't have a pair of Radeon HD 7970s left over from some old machine, so the most balanced approach we can take at least takes processors, platforms, and graphics into consideration. Therefore, we're adding the $800 Tahiti-based duo to our shopping list.
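
To make the scoping effect concrete, here's a quick sketch that runs the savings math under each framing. The full-system totals, the $80 chassis, and the $800 Tahiti pair come from the figures above; the standalone CPU-plus-motherboard subtotals are hypothetical placeholders chosen only to preserve the roughly $180 gap.

    def savings_pct(intel, amd):
        """AMD's saving as a percentage of the Intel-based price."""
        return (intel - amd) / intel * 100

    # CPU + motherboard subtotals are hypothetical; the rest comes from the text.
    scopes = {
        "CPU + motherboard (AMD bias)": (560, 380),
        "CPU + board + two HD 7970s": (560 + 800, 380 + 800),
        "Complete system (Intel bias)": (1904, 1724),
        "Complete system + $80 chassis": (1984, 1804),
    }

    for scope, (intel, amd) in scopes.items():
        print(f"{scope}: {savings_pct(intel, amd):.1f}% saved")

    # CPU + motherboard (AMD bias): 32.1% saved
    # CPU + board + two HD 7970s: 13.2% saved
    # Complete system (Intel bias): 9.5% saved
    # Complete system + $80 chassis: 9.1% saved

The pattern matches the argument above: every shared dollar you fold in dilutes AMD's percentage lead.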

The only way we can make AMD's FX-8350 look like a better gaming value than Intel's Core i7-3770K (specifically in the games and at the settings we used to test) is if the rest of the system is free. Because the rest of the system is never free, the FX-8350 never serves up better high-end gaming value.

From now on, we'll need to limit the use of AMD's flagship to systems already bottlenecked by their graphics cards. A less expensive CPU is more attractive when it isn't affecting performance negatively.
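
One rough way to tell which situation you're in: drop the resolution and watch the frame rate. If it barely moves, the graphics cards have headroom and the CPU is the limiter; if it jumps, you're GPU-bound and a cheaper CPU costs you little. A minimal sketch of that rule of thumb, with illustrative numbers:

    def likely_bottleneck(fps_low_res, fps_high_res, tolerance=0.10):
        """Rule of thumb: if lowering the resolution barely raises the
        frame rate, the CPU (not the GPU) is the limiting component."""
        if fps_low_res <= fps_high_res * (1 + tolerance):
            return "CPU-bound"  # GPU has headroom; a faster CPU would help
        return "GPU-bound"  # performance scales with GPU load

    print(likely_bottleneck(62, 60))   # e.g. 1920x1080 vs. 5760x1080 -> CPU-bound
    print(likely_bottleneck(110, 60))  # -> GPU-bound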

Intel Bias Is In The (AMD) Cards?

Our benchmark results have long shown that ATI's graphics architectures are more dependent on a strong processor than Nvidia's. As a result, we usually arm our test beds with high-end Intel CPUs when it comes time to benchmark high-end GPUs, sidestepping platform issues that might adversely affect results designed to isolate graphics performance.

We were hoping that AMD's Piledriver update would break that trend, but even a handful of impressive advancements aren't enough to match the effectiveness of AMD's graphics team. Might Steamroller be the evolutionary step forward needed to unleash the GCN architecture's peak performance?

Comments
  • redh4t, 24 January 2013 17:44
    So the Core i5-3570K should be in first place for gaming value, ahead of the i7-3770K and AMD's FX-8350. Right?
  • Steveymoo, 24 January 2013 18:08
    Spikes and microstutter are really the definitive difference between manufacturers now. Most people only own a 60 Hz flat-panel monitor, so if you buy mid- to top-end hardware, you will hardly ever see frame rates drop below 60.

    What you're paying a premium for, it seems, is a fluid experience. If you buy an Nvidia SLI/Intel combo, your wallet will take a hit, but you're in for less stutter. Go for an all-AMD CrossFire/CPU setup, and you might just notice the occasional stutter - not a problem for casual gamers, but a bit of a nuisance when you're twitch gaming. My aging hardware often causes problems in BF3 - flipping round a corner, spinning round, and so on - God help you if your GPU freezes for just a split second.
  • swamprat, 24 January 2013 19:58
    Quote:
    Our benchmark results have long shown that ATI's graphics architectures are more dependent on a strong processor than Nvidia's.

    Is there a cunning (or daft) reason that Nvidia cards wouldn't work with AMD processors? If not, then wouldn't it make sense to test that too - perhaps go down the route of finding the slowest CPU from each side that gets up to a certain level (either absolute frame rates, or a percentage of the top Intel result, or some such)?
  • mactronix, 24 January 2013 22:39
    Quote (swamprat):
    Is there a cunning (or daft) reason that Nvidia cards wouldn't work with AMD processors? If not, then wouldn't it make sense to test that too - perhaps go down the route of finding the slowest CPU from each side that gets up to a certain level (either absolute frame rates, or a percentage of the top Intel result, or some such)?


    I totally agree with this. Nvidia should have been tested as well.

  • Daedalus12, 24 January 2013 22:55
    Quote (redh4t):
    So the Core i5-3570K should be in first place for gaming value, ahead of the i7-3770K and AMD's FX-8350. Right?


    That's what it looks like to me.
  • doveman, 25 January 2013 00:04
    All very interesting, but try testing with Arma2OA and DCS World/Black Shark 2. With my 6950 2 GB and Phenom II X4 955 overclocked to 3.8 GHz, I often get 20 fps or less in A2 and 25-35 fps in BS2 on a single 1920x1200 @ 60 Hz display.

    A2 can go up to about 60 fps depending on the mission, but it seems the AI and some other stuff are very processor-heavy (and apparently it doesn't work as well on AMD CPUs anyway), which results in the GPU only being used 30-40%. Some of the worst missions seem to be the official campaigns, which apparently use a lot of scripting and drag it down to 17 fps at times.

    BS2 never goes above about 40 fps and drops to about 25 fps or lower whenever there are several other plane/helo models in view, such as when flying towards an airfield, and again it only uses about 40% of the GPU most of the time. For a 64-bit game, it was also disappointing to find it only uses about 2 GB of my 16 GB of RAM, so I made an 11 GB RAM disk for it instead to make it load faster and eliminate the stuttering/jitters.
  • jonboy79, 27 January 2013 17:19
    The FX-6300 needed to be in there too.
  • wild9, 2 February 2013 21:31
    [OT] Is there any way to disable the auto video playback on Tom's? It's having quite an impact on my system resources, and as such I'm far less inclined to open multiple reviews. For instance, I have five reviews open and my dual-core Athlon 64 rig is pulling 60% resources under Opera. Other sites work fine.

    To be honest, it's rather irritating when you have seen the video more than once and it keeps on opening every time you just want to read a review. I respect the need to raise advertising revenue, especially in these difficult times; I just find the way the video content automatically loads to be somewhat frustrating.

    Thank you.
  • keyholder, 11 February 2013 06:09
    You say "but nobody games at 1920x1080". Don't they?

    I know for one that I do. Yes, I have 2 x 7950s in CrossFire. Do I want more than one monitor? No. Do I want to game at 5760x1080? Err, no, I don't.

  • Xerpadon Xerilious, 11 April 2013 09:57
    The setups are mismatched: the Sabertooth 990FX is PCIe 2.0, while the Sabertooth Z77 is PCIe 3.0. For an equal comparison they needed the Sabertooth 990FX R2.0 Gen3. The difference between the Sabertooth 990FX (B.1604) and the Sabertooth R2.0 Gen3 (B.0305) is about 10-20% (it differs a lot between website reviews, games, and resolutions). Not to be a conspiracy nut, but the reviewers should have known this from the start. I'm not sure whether they're going downhill, or whether they made an honest mistake.