What does it mean to run face-first into a bottleneck? When we talk about bottlenecks here on Tom's Hardware, we're usually referring to a single component that's preventing the rest of a PC from reaching its full performance potential in any given benchmark. For games, that component is usually either the CPU or graphics card, depending on how much performance the other part provides.
Our System Builder Marathon machines often expose CPU limits when multiple graphics processors are combined, but most gamers begin their builds with only a single GPU. Among these, AMD’s $320 Radeon HD 5850 represents the highest performance most gamers will want to spend money on. The argument, of course, is that as you start shopping for more expensive alternatives, like a $700 Radeon HD 5970, consoles start looking a lot more attractive.

With our best bang-for-the-buck graphics card fairly well defined, the question becomes: how much CPU do we need to milk the last ounce of performance from this pixel-spewing beast? Would a dual-core CPU do the job or, given that today’s games are ever-more multi-threaded, would a triple- or even quadruple-core processor be needed? How much could overclocking help? Must we spend all of the money saved on the CPU to purchase a big cooler? Knowing that all of the subsequent questions must be addressed to completely answer the first, we gathered our Intel and AMD processor samples and began testing.
- Opening The CPU Bottleneck
- Two $350 Platforms
- Is Overclocking Needed?
- Test Settings
- Benchmark Results: 3DMark Vantage
- Benchmark Results: Call of Duty: Modern Warfare 2
- Benchmark Results: Crysis
- Benchmark Results: DiRT 2 Demo
- Benchmark Results: Far Cry 2
- Benchmark Results: Tom Clancy’s H.A.W.X.
- Benchmark Results: S.T.A.L.K.E.R.: Call of Pripyat
- Power And Efficiency
- Conclusion
Following the links within this and the other articles, and comparing across them, gives a really good picture of just where and why different CPUs stand in the order of things.
I think you hit the performance levels bang on, and I'd regard anyone complaining as nit-picking.
Mactronix
I reached this conclusion a couple of years or so ago, in the days of the first S.T.A.L.K.E.R. instalment, when a friend and I purchased the latest and greatest card of the time, the X800 XT.
Frame rates were sadly disappointing at the higher settings, and we wondered why we had paid such a high price (around £300.00 at the time) for a card that simply didn't cut the mustard.
If you pay top dollar for a card then you should be rewarded with top frame rates and glorious resolutions and of course, ultra high settings in any game.
Sadly this situation remains to this day.
The fault of course lies with the game makers, who are perhaps overstepping their bounds by introducing quality settings which are impossible to achieve with the hardware available, giving a glimpse of what could be but, under normal circumstances, never will be.
Yes, if you spend a couple of grand on your rig you're certainly getting there, but at what expense in noise and power costs? It simply isn't worth it.
What's the use of ultra realism anyway? Games are a fantasy world where realism doesn't count in my book. WoW plays and looks great on my laptop; it's a fantasy and I don't want to see it any other way. Those graphics are plenty good enough.
"if you want to seriously game, then buy a console"???
You bought an X800 XT "a couple or so years ago" for £300; I bought my 8800 GT in 2007 for less than £200, so either you bought longer ago than you can remember or you paid a silly price for an older card! The cheapest recommended gaming card in the monthly charts is the 4650, retailing for less than $50 (£35), and it ranks as faster than your old X800 XT.
Wikipedia says "The Radeon X800 "R430"-based 110 nanometer series was introduced at the end of 2004"
The Xbox 360, released Nov. 2005, is based on ATI's R520; the equivalent graphics card on a PC is the X1900. It has the graphics power of a chip released in 2005, and it always will.
The PS3, released Nov. 2006, has graphics based on the NV47 chip (Nvidia GeForce 7800 architecture).
Consoles fly in the face of Moore's law and keep gamers trapped in time. The graphics quality is just not comparable to modern GPU and CPU combinations, even when you have programmers optimizing code for a specific console versus the less-refined coding for a broad range of hardware that PC development requires.
The graphics and CPU horsepower of PC gaming, and the widespread uptake of high-definition TVs and monitors, really do leave consoles struggling to keep up.
I'm afraid my friend, you are talking out of your backside!
It was obviously longer than a couple of years ago; I simply can't be bothered to dig out receipts, boxes, or any other sad memories of inefficient, overpriced graphics cards.
I simply make the point that game manufacturers should code games for the relevant technology of the time, not for hardware that's still a year or two away.
If you are and have been satisfied with your purchase then good for you, I'm not knocking that, anything that floats your boat, etc.
But to charge an exorbitant sum for a card that, to my mind, underperforms is not on.
Perhaps the avid gamer tries to kid himself that the frame rate in front of him is acceptable; perhaps the eye deceives him; perhaps in his delusional state he doesn't see, or feel through the mouse, the effort the struggling card is enduring. I don't know. All I do know is that you simply don't get enough bang for your buck, not without paying out a small fortune.
As for consoles, you know full well they have limitations, and for the price you pay you don't expect ultra-smooth, ultra-high performance; because of that, the game is much more enjoyable.
I'm not denying that consoles are better value for money, but they're built for the lowest common denominator, which still leaves a lot of us unhappy with them or unable to use them.
Time has moved on. I don't own an ultra GPU; in fact I have an ATI 4870 and a simple dual-core E8400, nothing cutting-edge or overpriced there, yet my PC will thrash any console game to high heaven for resolution, detail, and frame rate.
You can't say that to seriously game you need to buy a system (in this case a console) which is years old and obviously far inferior to even mid-range current hardware; a budget PC will step all over consoles these days.
You may love your console, so do my kids, but it does not compare to a PC in any way.
People who pay $300 for a GPU are enthusiast gamers who want and appreciate the nuances involved in setting up the best experience they can. Have you seen a game like Oblivion played on a console and then the same game played on an optimised PC system? I seriously doubt it, or you wouldn't have posted what you did.
What I think Tom's needs to do now is go at it the other way: run the same tests but with a 5830 and a 5770, and also include a 4870 and a 4850. Nvidia cards to suit as well, of course, and that way we can see where the sweet spot is.
This has shown us where CPU upgrading starts to flatten out, but do you really need a 5850 to get playable frame rates at high settings?
Mactronix
I imagine that by sticking with the existing resolutions (1680x1050, 1920x1200, and 2560x1600), readers of the reviews can directly compare against previous reviews. Replacing 1920x1200 with the more common 1920x1080 reduces the pixel count by 10%, and the extra performance that gives would skew the results, so you wouldn't be able to compare like with like.
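For what it's worth, that 10% figure checks out if you compare raw pixel counts of the two resolutions:

```python
# Pixel counts for 16:10 (1920x1200) versus 16:9 (1920x1080)
pixels_1200p = 1920 * 1200  # 2,304,000 pixels
pixels_1080p = 1920 * 1080  # 2,073,600 pixels

# Relative reduction in rendered pixels when dropping to 1080p
reduction = 1 - pixels_1080p / pixels_1200p
print(f"{reduction:.0%}")  # prints "10%"
```

Fewer pixels to render means higher frame rates at the same settings, which is exactly why swapping resolutions mid-series would break comparability.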
A review of one card on its own isn't particularly useful; you need direct comparisons to be sure of its relative performance.
I disagree, what this article shows us is that these games were limited by the GPU, so the CPU wasn't a bottleneck. I think you'll find that with weaker graphics, all you'll get is a whole bunch of results with very little difference between the CPUs.
If they want to see more of a difference with CPUs, they need to get rid of the graphics bottleneck (possibly with a 5970 instead of a 5850), or show benchmarks for games that are more CPU intensive, such as GTA IV, L4D2 or strategy games.