
Gaming Shoot-Out: 18 CPUs And APUs Under £160, Benchmarked

Now that Piledriver-based CPUs and APUs are widely available (and the FX-8350 is selling for less than £160), it's a great time to compare value-oriented chips in our favourite titles. We're also breaking out a test that conveys the latency between frames.

At least on the desktop, dual-core processors rarely helped bolster performance when they were first introduced. Most mainstream apps simply hadn't been optimized for multiple cores; that sort of technology was principally exploited in the server and workstation space, where you had multi-socket motherboards with single-core chips cranking on complex problems in parallel. But games were almost exclusively written to run on a single core.

Programming with threading in mind isn't easy, and it took developers years to adapt to a world where CPUs seemed destined to improve performance through parallelism rather than the 10 GHz clock rates Intel had foreshadowed back in 2000. Slowly, though, the applications most able to benefit from multiple cores working in concert have been rewritten to utilize modern hardware.

Want proof? Just have a look at our benchmark suite. Only two pieces of software we test are still single-threaded: Lame and iTunes. Everything else, to one degree or another, is threaded. Content creation, compression, and even productivity apps tax the highest-end four- and six-core CPUs.

Games, on the other hand, have taken longer to "get there." With a primary emphasis on graphics performance, it's not surprising that single-threaded engines still exist. However, spawning additional threads and utilizing a greater number of cores allows ISVs to implement better artificial intelligence or add more rigid bodies that can be affected by physics.
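To make that concrete, here's a minimal sketch, in standard C++, of how an engine might fan a rigid-body update out across however many cores the CPU exposes. It isn't taken from any real engine; the type and the numbers are purely illustrative.

```cpp
// Minimal illustrative sketch: splitting a physics integration step across
// however many hardware threads the CPU reports. Not from any real engine.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct RigidBody { float x, y, vx, vy; };   // toy stand-in for a physics object

void integrate(std::vector<RigidBody>& bodies, float dt) {
    unsigned workers = std::thread::hardware_concurrency();  // e.g. 2 on a Pentium, 8 on an FX-8350
    if (workers == 0) workers = 1;                           // the call may return 0 if unknown
    std::size_t chunk = (bodies.size() + workers - 1) / workers;

    std::vector<std::thread> pool;
    for (unsigned t = 0; t < workers; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end   = std::min(begin + chunk, bodies.size());
        if (begin >= end) break;
        // Each thread integrates its own disjoint slice, so no locking is needed.
        pool.emplace_back([&bodies, begin, end, dt] {
            for (std::size_t i = begin; i < end; ++i) {
                bodies[i].x += bodies[i].vx * dt;
                bodies[i].y += bodies[i].vy * dt;
            }
        });
    }
    for (auto& t : pool) t.join();  // wait for every slice before the frame continues
}
```

A dual-core chip runs two of those slices at once and a quad-core runs four, which is exactly why extra cores only pay off when a workload can be carved up this way.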

Increasingly, then, we're seeing more examples of games exhibiting better performance when we use a quad-core processor. They're still the exception, though, rather than the rule. And that's why the great single-threaded performance of Intel's Sandy Bridge architecture (and later Ivy Bridge) dominated most of our processor-bound game testing. Back in the day, dual-core Pentiums went head-to-head against quad-core CPUs from AMD, and came out in the lead.

It's now clear that gunning for higher and higher clock rates is not the direction AMD and Intel are going. They're both building desktop-oriented CPUs with as many as four modules (in AMD's case) or six cores (in Intel's). In turn, game developers continue getting better about utilizing available on-die resources. We're clearly at a point where you need at least a dual-core CPU to enjoy today's hottest titles, if for no other reason than sticking with a single-core chip would put you about eight years back in processor technology. But is there a reason to skip over the dual-core models and jump right into the world of gaming on a quad-core CPU?

That's what we're hoping to answer today, and we have a new tool to help us.

Comments (15)
This thread is closed for comments.
  • Kamen_BG, 12 February 2013 15:10 (score: 1)
    At first I found it weird that Tom's reported the Pentium G860 to be faster than the Phenom II 980 at gaming.
    Now I wonder how there's practically no difference between the Zambezi and Vishera CPUs.
  • Flying-Q, 12 February 2013 15:29 (score: 3)
    Quote:
    the 75th percentile result shows us the longest lag between consecutive frames that we see 75 percent of the time, and so on with the 95th percentile

    No, no, no, no, NO!

    All the values (of latency, in this case) sit on a range centred on the 50th centile, where you can expect 50% of the population (of data) to be below and 50% above. At the 75th centile, 75% of your population will be below that value and 25% will be above. At the 95th centile, 95% of the population will be below and only 5% above. This works likewise for the 5th and 25th centiles. The metric you should be focusing on is the range, i.e. the spread of the data. For example, a graphics setup with visible micro-stuttering would manifest a large range between the 25th and 75th centiles (often called the inter-quartile range, within which half of your data resides) and an even larger range (obviously) between the 5th and 95th centiles. This wide spread would show the large variation in rendering times for consecutive frames.

    On a system with little or no micro-stuttering, you would expect the inter-quartile and 5th-95th ranges to be narrow, indicating very regular frame rendering times. Further to this, if the inter-quartile range is small but the 5th-95th range is large, that would indicate that most frames are rendered well and consistently, but there are occasional large outliers. These outlier values would be perceived as micro-stuttering as well.
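    In code, that calculation looks something like the following minimal sketch. The latency samples are invented, and the nearest-rank percentile used here is only one of several common definitions:

```cpp
// Minimal sketch of the centile/range maths above, applied to frame-to-frame
// latencies in milliseconds. The sample values are invented.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <vector>

// Nearest-rank style percentile: the value below which roughly p% of samples fall.
double percentile(std::vector<double> samples, double p) {
    std::sort(samples.begin(), samples.end());
    std::size_t rank = static_cast<std::size_t>(p / 100.0 * (samples.size() - 1));
    return samples[rank];
}

int main() {
    std::vector<double> frame_ms = {16.6, 16.9, 17.1, 16.7, 33.4,
                                    16.8, 17.0, 16.5, 41.2, 16.6};
    double iqr    = percentile(frame_ms, 75) - percentile(frame_ms, 25);
    double spread = percentile(frame_ms, 95) - percentile(frame_ms, 5);
    // A wide inter-quartile range means generally inconsistent frame delivery;
    // a narrow IQR with a wide 5th-95th spread points to occasional outlier spikes.
    std::printf("IQR: %.1f ms, 5th-95th range: %.1f ms\n", iqr, spread);
    return 0;
}
```

    With these made-up samples the IQR comes out narrow (0.4 ms) while the 5th-95th range is wide (16.9 ms): consistent rendering punctuated by outlier spikes, exactly the second pattern described above.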

    Please, Don, when you delve into statistical analysis, be precise with your descriptors. Otherwise it detracts from your clearly extensive knowledge and experience in the computing world.

    Q.

    Edit for grammar.
  • SchizoFrog, 12 February 2013 16:13 (score: 2)
    You can wrap it up as much as you want, but a £90 Intel CPU (the i3-2120 or the i3-3220) still batters an AMD chip that goes for over £150. Well done to Intel. It also has to be said that it's been a while since AMD seriously challenged Intel, and yet Intel still manages to improve development and performance while keeping the end cost of the product at around the same level. £90 for a CPU that can tear through games at this resolution (1920x1080) is great news for gamers, and you don't have to break the bank to go that little bit further into the high-end performance zone with an i5.
  • HEXiT, 12 February 2013 16:18 (score: 1)
    Clearly there was never that much difference in performance at sub-200 prices, and the 8350 is a good improvement when overclocked. Who knows, maybe the old testing suite was a little Intel-biased. It certainly looks a lot more balanced now...
    Which is a good thing for the consumer, as it will stop Intel from becoming complacent. Fanboi it all you want, but we need competition in the market or the end user suffers.
  • AndrewJacksonZA, 12 February 2013 17:42 (score: 0)
    I know that you're exclusively testing CPUs and not the whole ecosystem, but it struck me as I read this last page that one would not, in real life, use a CPU in isolation for serious gaming, or pair a super-expensive GPU with a budget CPU.

    What would the results have been if you used a little 6670 with the CPUs? Would the AMD CPUs have performed better because of the ability to Crossfire with their APUs where appropriate?

    Disclaimer: I am running an Intel E6750 @ 2.66GHz and an AMD 6670 at home on my gaming PC and I'm happily gaming all my games at maximum detail levels at the maximum resolution my monitor supports which is 1280 x 1024.
  • jakjawagon, 12 February 2013 18:55 (score: 1)
    Would have liked to see some older CPUs here. I use an i7 930, and would be interested to see how its frame latency compares to newer architectures. It's from 2010, same as the Athlon X3 used in the article.
  • bemused_fred, 12 February 2013 21:55 (score: 0)
    Quote (AndrewJacksonZA):
    I know that you're exclusively testing CPUs and not the whole ecosystem, but it struck me as I read this last page that one would not, in real life, use a CPU in isolation for serious gaming, or pair a super-expensive GPU with a budget CPU. What would the results have been if you used a little 6670 with the CPUs? Would the AMD CPUs have performed better because of the ability to Crossfire with their APUs where appropriate? Disclaimer: I am running an Intel E6750 @ 2.66GHz and an AMD 6670 at home on my gaming PC and I'm happily gaming all my games at maximum detail levels at the maximum resolution my monitor supports which is 1280 x 1024.

    This has already been done.

    http://www.tomshardware.co.uk/fx-4100-core-i3-2100-gaming-benchmark,review-32384.html
  • SchizoFrog, 12 February 2013 23:14 (score: 0)
    Quote (jakjawagon):
    Would have liked to see some older CPUs here. I use an i7 930, and would be interested to see how its frame latency compares to newer architectures. It's from 2010, same as the Athlon X3 used in the article.

    This is a sub-$200 comparison, and the i7-930 and the i7-920 that it replaced were both around the $300 mark. It's well documented that those CPUs offered further performance gains over the i5 chips, but with diminishing value returns, creating a lower bang-per-buck ratio.
  • SchizoFrog, 12 February 2013 23:19 (score: 1)
    Quote (HEXiT):
    Clearly there was never that much difference in performance at sub-200 prices, and the 8350 is a good improvement when overclocked. Who knows, maybe the old testing suite was a little Intel-biased. It certainly looks a lot more balanced now... Which is a good thing for the consumer, as it will stop Intel from becoming complacent. Fanboi it all you want, but we need competition in the market or the end user suffers.


    That makes no sense. You can't say there was not much difference AND that the 8350 is an improvement while it is still severely lacking against CPUs of the same price, and is often matched, if not beaten, by the much cheaper i3 chips.
  • blobby91, 13 February 2013 03:21 (score: -1)
    Why would you buy an AMD APU then pair it with a GTX 680?
  • SchizoFrog, 13 February 2013 06:49 (score: 2)
    Quote (blobby91):
    Why would you buy an AMD APU then pair it with a GTX 680?

    They use the GTX 680 to remove any possibility of a GPU bottleneck, so that the results reflect true CPU performance.

    The only reasons these days to buy an AMD CPU are to upgrade your current AMD system, or to build a budget gaming rig or an HTPC using one of their latest APUs without a dedicated GPU. If you are building a new system with a dedicated GPU, then Intel is the obvious choice.
  • jakjawagon, 14 February 2013 07:29 (score: 0)
    Quote (SchizoFrog):
    This is a sub-$200 comparison, and the i7-930 and the i7-920 that it replaced were both around the $300 mark. It's well documented that those CPUs offered further performance gains over the i5 chips, but with diminishing value returns, creating a lower bang-per-buck ratio.

    OK, perhaps something cheaper from the same generation/architecture.
  • swamprat, 14 February 2013 19:41 (score: 0)
    I'd be quite interested in seeing this type of analysis done on the Hybrid CrossFire that's meant to be possible with the APUs. I almost went for an AMD laptop but baulked at the last moment, and one of the factors behind that was someone complaining about micro-stuttering.
  • Blahman11, 15 February 2013 01:00 (score: 0)
    Well, it'll be interesting to see what happens when the next-gen consoles come out. Rumours are that the PS4 and/or the new Xbox is gonna be using AMD's Bulldozer (or Piledriver, dunno which) design for their CPUs. Might this mean that console ports will play better on AMD machines, as they're closer to the console's architecture?
  • Anonymous, 15 February 2013 23:31 (score: 0)
    Can you test with an AMD Radeon HD 7970 to see if the driver is bottlenecking the G860?