
Radeon HD 5850: Knocking Down GTX 295 In CrossFire


The Radeon HD 4870 1GB sure dropped to $150 pretty quickly, didn’t it? The Radeon HD 4890 really isn’t all that far behind at $190 (as low as $170 with mail-in rebates). So, for the Radeon HD 5850 to be a success at $259, it’d better be appreciably quicker, right?

Nvidia has its own high-end bruisers in the same price range, too. A GeForce GTX 275 at $210 is mighty tasty. And a GTX 285—the company’s fastest single-GPU board—isn’t bad at $330 or so given its flagship status (less than $300 after some of those rebates).

If you haven’t yet checked out our review of the Radeon HD 5870, you might want to give it a quick peek. After all, the Radeon HD 5850 under our microscope today centers on the same fundamental architecture as that board (and I don’t think I can swing another 10,000 word story this week, so this piece isn't going to cover all of the GPU nuances).

Answering The Lynnfield Question

But that doesn’t mean we can’t break some new ground. One of the criticisms I saw come up in the comments section was that we used a $1,000 processor overclocked to 4 GHz for testing AMD’s Radeon HD 5870. Of course, that configuration was by design. These new GPUs are so powerful that we wanted to give them as much room to “breathe” as possible, without seeing congestion in the benchmarks due to processor bottlenecks. This presents a bit of a theoretical question for the folks running LGA 1156-based Core i7s, a Core 2-series chip on P45, or 790GX: does the move from an x16 PCI Express 2.0 slot to an x8 connection affect the performance of such a powerful GPU?
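To put rough numbers on that question, here is a quick back-of-the-envelope sketch in Python. The 5 GT/s signaling rate and 8b/10b encoding are standard PCI Express 2.0 parameters, so the figures below are theoretical ceilings rather than anything we measured:

```python
# Theoretical PCI Express 2.0 bandwidth, per direction.
# PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, so only
# 8 of every 10 bits on the wire are payload: 500 MB/s per lane.
TRANSFERS_PER_SEC = 5.0e9     # transfers per second, per lane
ENCODING_EFFICIENCY = 8 / 10  # 8b/10b line code

def pcie2_bandwidth_gb_s(lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe 2.0 link."""
    payload_bits_per_sec = TRANSFERS_PER_SEC * ENCODING_EFFICIENCY * lanes
    return payload_bits_per_sec / 8 / 1e9  # bits -> bytes -> GB

for lanes in (16, 8):
    print(f"x{lanes}: {pcie2_bandwidth_gb_s(lanes):.1f} GB/s per direction")
# x16: 8.0 GB/s per direction
# x8:  4.0 GB/s per direction
```

Halving the link halves the theoretical ceiling, from 8 GB/s to 4 GB/s each way. Whether a card like the Radeon HD 5850 actually pushes enough traffic over the bus for that to matter is exactly what the platform comparison that follows is meant to show.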

In order to help answer that, we took our Core i7-870 and overclocked it to 4 GHz on an Asus P7P55D Premium motherboard. I suspect that in a Lynnfield-based configuration, the Radeon HD 5870 will be a less popular choice than the cheaper Radeon HD 5850, so we tested a pair of 5850s on both Intel-based platforms to shed a little light on this one.

We also dropped the Core i7-870 to its stock speed in order to isolate the effect of processor performance, comparing against our overclocked Lynnfield-based results.

Is CrossFire Worth It?

When ATI launched the Radeon HD 4770 at $110, we couldn’t pass up the opportunity to compare two of the cards together against Radeon HD 4890s and GeForce GTX 275s. But the Radeon HD 5850 is not a cheap piece of hardware at $259. A pair costs just under $520. The only single card in that neighborhood is a GeForce GTX 295, which can be found for roughly $500 and hasn’t yet been discounted, despite AMD’s Cypress launch. To be fair, there isn’t yet a need, as the 5870s are still in extremely limited supply, so the challenge seems to be getting your hands on one.

But maybe the Radeon HD 5850 will change that. Today, we’ll be looking for a single Radeon HD 5850 to stand up to Nvidia’s GeForce GTX 285—a card AMD couldn’t contend with using a single-GPU solution in the past—and a pair of 5850s to at least eke past the GeForce GTX 295, priced similarly.

Comments

This thread is closed for comments.
  • chispas, 30 September 2009 15:43 (+1)
    It's about time video card technology became smaller and more efficient without compromising advancements in speed. It's bad enough you even need to consider buying two cards in this day and age - Voodoo 2 SLI was never cool, and in 2009 it still isn't.

    As long as ATI continues to make high-performance graphics cards which can still fit sensibly (9.5") inside smaller PC cases, Nvidia are sure to lose the technology war. I certainly hope Nvidia aren't planning any 19" long cards!

    Crysis @ 1680 @ High @ 50 frames w/ 1 gfx card = how it should be.
  • LePhuronn, 30 September 2009 18:15 (+2)
    chispas wrote: "Crysis @ 1680 @ High @ 50 frames w/ 1 gfx card = how it should be."

    Yet it won't be, as long as game developers create engines capable of outstripping single GPUs and pushing the hardcore gamer into multi-GPU setups. And given that GPU manufacturers now have viable multi-GPU technology, they will push it hard to sell more cards, thus allowing game developers to keep pushing the upper limits of their engines.

    Now, if the hardcore gamer stopped buying multi-GPU systems then the above cycle would end, but they won't - either through pride or stupidity, the hardcore gamer will say "but to run this at maximum I need 3 GTX-whatevers" and will go and buy them.

    In a couple of years single cards WILL be able to play Crysis as you say - there were many similar complaints about Elder Scrolls' requirements when it came out, but it's hardly an issue now. But frankly, I'm tired of people using Crysis as that measuring stick - it's totally unrealistic! Either through optimism or pure stupidity, the Crytek engine is just beyond any sensible level of technology we have at the moment. I thought Doom 3 on Ultra quality was a bad idea when it first came out and then tech caught up, but I'm not sure we'll see that with Crysis any time soon.
  • sirkillalot, 30 September 2009 20:16 (0)
    We need more quiet, less power-hungry, and cooler GPUs. Keep them coming :)
  • lapoki, 30 September 2009 20:29 (0)
    I wonder what hardware the guys at Crytek use to make a game like Crysis...
  • LePhuronn, 30 September 2009 22:04 (0)
    lapoki wrote: "I wonder what hardware the guys at Crytek use to make a game like Crysis..."

    I don't think it was tested at maximum resolution and effects. They probably had cutting-edge hardware, built the game so it was playable at maximum settings, and then turned up the dials of what the engine could do visually - at that point you only need to render stills to test the effects, or pre-render sequences as video to test it out.
  • redkachina, 1 October 2009 00:19 (0)
    These new cards make the 4890 look like a cheap card, LoL - luckily I waited before buying a 4890 / 4870 X2...
  • LePhuronn, 1 October 2009 01:19 (-1)
    As good as they are, I'm holding off for NVIDIA's cards or getting GTX 285s on the cheap (as in, Overclockers are doing a self-branded GTX 285 for £200 cheap) - unless the companies who produce the design software I use embrace ATI's stream processors, I'm afraid it's CUDA all the way with me.
  • tony_wilson, 1 October 2009 12:19 (0)
    Since I rarely do anything other than basic functions on my comp and just love playing games on PC over consoles, I'm glad to find out that I can buy a nice mobo for $150 and a $200 CPU and get a decent gaming rig. Can anyone please explain to me all the hype around overclocking as far as games are concerned? Or is it mainly for applications that are CPU-intensive?
  • LePhuronn, 1 October 2009 17:10 (+1)
    tony_wilson wrote: "Can anyone please explain to me all the hype around overclocking as far as games are concerned?"

    It's not hype, it's basic maths: if you have something that goes X fast but is capable of going Y faster then pushing it to Y will get things done faster than X.

    Doesn't matter if it's games, video encoding, finance calculations or anything - if your chip can be overclocked to get the job done faster then why not do it?

    Also, don't think that games can't be CPU-intensive: Crysis, for instance, has a purely software-driven sound engine that is processed by the CPU, and GTA IV is such a bad port it runs even a quad-core really hard.

    The real discussion though is dual-core vs quad-core in games - do you go quad core when most games won't take advantage of it? For the longest time the consensus was always a faster dual-core was better for you, but GTA IV and a few others have now started being quad-aware and that trend is no doubt growing.
  • tony_wilson, 1 October 2009 23:50 (0)
    Allow me to clarify my question. When saying "hype" I mean popular, not unproven performance. Since the CPU will inevitably run hotter because of the overclock, I will now have to spend another $50 on a decent cooler, like a nice Rosewill or something. Am I really going to notice a big difference of 10-15% when I'm already getting 80-120 fps in games, or good performance in other applications? Do you see my point? Look at the benchmarks above. They sent that thing to 4 GHz! How long will it last, and is it worth such a small gain if you have to spend another $50?
  • LePhuronn, 2 October 2009 00:54 (+1)
    ^^ I see what you're saying.

    Usually it's massive voltage increases and/or not dealing with heat properly that cause damage to the CPU - the D0-stepping i7, for instance, only needs a tiny voltage boost, if any, to reach the 3.8GHz+ mark, and keeping the temps to around 75 degrees or less keeps the chip OK. That being said, overclocking the CPU to these levels won't do it any harm.

    The reason for the overclock in this case, though, is to remove any possible CPU bottlenecks when driving such a powerful CrossFire setup - as the article is about the GPUs' performance, you want the numbers coming out to be as pure as possible.

    Real-world? Overclocking isn't going to hurt you, and if you can keep it under control, why the hell not? But if you can get the perceived standard 60fps+ out of a CPU at stock, then yeah, you're probably just splitting hairs or showing off by cranking your CPU up for those extra 20 or 30fps you're never going to see.

    Applications are a slightly different kettle of fish IMO. I do a lot of graphic and video work, and I render and encode masses of data. As a result I can never go too fast, and even if I get an extra 10% performance by overclocking I'll take it - 10% off a 60-second render may only be 6 seconds, but if I can shave 24 minutes off a 4-hour render then it's worth it.
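The proportional-scaling arithmetic in that last point is easy to check; here is a minimal sketch in Python (the 10% gain and the render durations come from the comment above; strictly speaking, a 10% throughput gain saves about 9.1% of wall-clock time, which is where the rounded figures come from):

```python
# A task that scales with clock speed finishes in old_time / speedup,
# so a 10% throughput gain trims ~9.1% off the wall-clock time --
# close to the rounded "10% off" figures quoted above.
def time_after_overclock(task_seconds: float, gain_pct: float) -> float:
    """Wall-clock time once throughput improves by gain_pct percent."""
    return task_seconds / (1 + gain_pct / 100)

for task in (60.0, 4 * 3600.0):  # a 60-second render and a 4-hour render
    saved = task - time_after_overclock(task, 10.0)
    print(f"{task:7.0f} s task: saves about {saved:.0f} s")
# Output:
#      60 s task: saves about 5 s
#   14400 s task: saves about 1309 s (~22 minutes)
```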
  • zsolmanz, 2 October 2009 02:59 (0)
    Now I have to wonder if my HD4890 was a good buy (with the intention of going Crossfire at some point), considering DX11 is properly 'on the cards'...
  • tony_wilson, 2 October 2009 05:34 (0)
    I see. Thank you for the response. When I get my i7-860 I'll try it out.
  • mildiner86, 3 October 2009 18:31 (0)
    tony_wilson wrote: "I see. Thank you for the response. When I get my i7-860 I'll try it out."

    Great choice of CPU, and that one needs to be OC'd :p With a £40 cooler you could most likely get to 3.6-3.8 GHz.

    In the past, after OCing my system, I have seen a 30% increase in 3DMark.

    As long as your temps and voltages are fine, there's no downside to OCing. Just don't be tempted to push the CPU to its max in order to obtain that last 2-5% performance boost lol
  • LePhuronn, 21 October 2009 07:47 (0)
    Why did I get marked down just for giving a genuine, non-fanboi reason for sticking with NVIDIA cards?

    Or is the sheer mention of the green team in a red team article really that much of an insult?
  • Redsnake77, 6 March 2010 04:07 (0)
    In your Stalker: Clear Sky bench, you say you used Ultra settings. Did this include DX10.1 lighting and Sun detail on Ultra (god rays)? It's just that I've completed a new build, with an i7-930 (2.8-2.93 GHz), 6GB of 1600 MHz RAM, and 2 Sapphire HD 5850 Toxics in CrossFire, and at 1920 x 1200 it barely gets into double-digit frames per second. Catalyst 10.2 being used. Any help would be appreciated.
  • zsolmanz, 20 March 2010 20:56 (0)
    What operating system are you using?
  • Redsnake77, 20 March 2010 23:22 (0)
    Win 7 64-bit, 6GB of 1600 MHz RAM. Overclocking from 2.8 GHz (2.93 GHz most of the time, actually) to 3.5 GHz made absolutely no difference to frames per second.