With our testing complete, we wanted to plot out average performance at 1920x1080, with Intel's Pentium G860 standing in as our 100% baseline.
What you see below is a very different aggregate chart compared to the one in Picking A Sub-£160 Gaming CPU: FX, An APU, Or A Pentium?, particularly when it comes to the Pentium. The rest of the results fall close to where we would have expected them, based on our previous testing. AMD's processors do come closer to Intel's last-gen and Ivy Bridge-based Core i3s. Indeed, the FX-8350, FX-6300, and FX-4300 are nipping at the heels of the entry-level Intel chip. The Phenom II X4 and X6 are as well, though neither is available any more. Even quad-core APUs like the A10-5800K and A8-3870K hold their own.
The performance curve starts to fall off pretty quickly once we look at the Pentium G860, Athlon II X3 450, and the two A4 APUs.
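As a minimal sketch of how that normalisation works (Python, with made-up average frame rates rather than our measured results), each chip's score is simply its average FPS expressed as a percentage of the baseline chip's:

```python
# Minimal sketch of baseline-relative scoring.
# The average FPS figures here are illustrative placeholders, not measured data.

avg_fps = {
    "Pentium G860": 54.0,   # the 100% baseline
    "FX-4300":      58.5,
    "FX-6300":      60.2,
    "Core i3-3220": 63.7,
}

baseline = avg_fps["Pentium G860"]

for cpu, fps in sorted(avg_fps.items(), key=lambda kv: kv[1], reverse=True):
    # Each chip's score is its average FPS as a percentage of the baseline.
    print(f"{cpu:>14}: {100.0 * fps / baseline:5.1f}%")
```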

It isn't entirely clear what changed in the year since our previous look at processors in this price range to affect performance. But we are using some new games, old games that have since been patched, new drivers, and a new operating system, so all of that is in play. Regardless, AMD's FX processors, its two-generation-old Phenom II X6 and X4 CPUs, and the company's Athlon II X4 look a little better compared to Intel's Core i3 than they used to. In contrast, the Sandy Bridge-based Pentium G860 falls back relative to where it was.
The Pentium isn't bad, to be sure. In fact, for £55, it still does really well against the FX chips we tested, which cost £95 and up, use quite a bit of power, and generate significantly more heat. Nevertheless, we see the trend toward more threaded titles continuing, compelling us to start distancing ourselves from dual-core, non-Hyper-Threaded CPUs in 2013. At least for the time being, whatever quad-core Athlon II and Phenom II processors are still available seem like smart buys.
Once those dry up, what then? Intel still holds the aces. For your pound, the Core i5 has no competition above £100 and the Core i3-3220 is tough to beat. It no longer humiliates the FX line-up in games thanks to AMD's most recent architectural update, but it's still cheaper, faster, and more power-friendly than most of the Vishera-based models.
Fortunately for AMD, its chips fare better in the non-gaming components of our benchmark suite, where its modular architecture is better able to benefit from today's threaded software. In a general-purpose workstation, that's certainly something to think about. But in a pure gaming machine, there's just no ignoring the effectiveness of Intel's Sandy and Ivy Bridge designs.
Now I wonder how there's practically no difference between the Zambezi and Vishera CPUs.
No, no, no, no, NO!
All the values (of latency, in this case) are set on a range centred on the 50th centile, where you can expect 50% of the population (of data) to be below and 50% above. At the 75th centile, 75% of your population will be below that value and 25% will be above. At the 95th centile, 95% of the population will be below and only 5% above. The same applies to the 5th and 25th centiles. The metric you should be focusing on is the range, i.e. the spread of the data. For example, visible micro-stuttering in a graphics setup would manifest as a large range between the 25th and 75th centiles (often called the inter-quartile range, within which half of your data resides) and an even larger range, obviously, between the 5th and 95th centiles. This wide spread would show the large variation in rendering times for consecutive frames.
On a system with little or no micro-stuttering you would expect the inter-quartile and 5th-95th ranges to be narrow, indicating very regular frame rendering times. Further to this, if the inter-quartile range is small but the 5th-95th range is large, that would indicate that most frames are rendered consistently, with occasional large outliers. Those outlier values would be perceived as micro-stuttering as well.
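To make that concrete, here's a minimal sketch in Python (the frame-time trace is invented for illustration, not real benchmark data) computing the centiles and the two spreads described above:

```python
# Minimal sketch: centiles and spreads of a frame-time trace.
# The frame times below (in milliseconds) are made up for illustration.

frame_times = [16.6, 16.8, 17.0, 16.5, 33.4, 16.7, 16.9, 17.1, 16.6, 31.8,
               16.7, 16.6, 17.0, 16.8, 16.5, 16.9, 17.2, 16.6, 16.8, 16.7]

def centile(data, p):
    """Nearest-rank centile: roughly, the value below which p% of data falls."""
    ordered = sorted(data)
    rank = round(p / 100 * (len(ordered) - 1))
    return ordered[rank]

p5, p25, p50, p75, p95 = (centile(frame_times, p) for p in (5, 25, 50, 75, 95))

iqr = p75 - p25    # inter-quartile range: spread of the middle half of frames
full = p95 - p5    # 5th-95th range: widens when there are occasional outliers

print(f"median {p50:.1f} ms, IQR {iqr:.1f} ms, 5th-95th range {full:.1f} ms")
# A small IQR alongside a large 5th-95th range means most frames render
# consistently but occasional outliers spike, which is felt as micro-stutter.
```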
Please, Don, when you delve into statistical analysis, be precise with your descriptors. Otherwise it detracts from your clearly extensive knowledge and experience in the computing world.
Q.
Edit for grammar.
Which is a good thing for the consumer, as it will stop Intel from becoming complacent. Fanboi it all you want, but we need competition in the market or the end user suffers.
What would the results have been if you used a little 6670 with the CPUs? Would the AMD CPUs have performed better because of the ability to Crossfire with their APUs where appropriate?
Disclaimer: I am running an Intel E6750 @ 2.66GHz and an AMD 6670 at home on my gaming PC, and I'm happily playing all my games at maximum detail levels at the maximum resolution my monitor supports, which is 1280x1024.
This has already been done.
http://www.tomshardware.co.uk/fx-4100-core-i3-2100-gaming-benchmark,review-32384.html
This is a sub-$200 comparison, and the i7-930 and the i7-920 that it replaced were both around the $300 mark. It's well documented that those CPUs offered further performance gains over the i5 chips, but at diminishing returns, creating a lower bang-per-buck ratio.
That makes no sense. You can't say there was not much difference AND that the FX-8350 is an improvement while it is still severely lacking against CPUs at the same price and is often matched, if not beaten, by the much cheaper i3 chips.
They use the GTX 680 to remove any possibility of a GPU bottleneck, so we get true CPU results.
The only reasons these days to buy an AMD CPU are to upgrade your current AMD system, or to build a budget gaming rig or an HTPC using one of their latest APUs without a dedicated GPU. If you are building a new system with a dedicated GPU, then Intel is the obvious choice.
Ok, perhaps something cheaper from the same generation/architecture.