Radeon HD 5770 And 5750 Review: Gentlemen, Start Your HTPCs

The last 30 days have seen a ton of new technology, from Intel’s Lynnfield-based Core i5 and Core i7 processors (which we reviewed here, tested in a number of different games with CrossFire and SLI setups here, and measured the effect of integrated PCI Express 2.0 right here) to ATI’s Cypress graphics processor (manifest through the Radeon HD 5870 and Radeon HD 5850). Between those launch stories, I’ve run thousands of benchmark numbers and written tens of thousands of words. Thus, when I sat down to write this Radeon HD 5770/5750 review (after running another 500+ tests), I had to mix it up a bit and have a little fun with the intro. Feel free to read while listening to Biz Markie’s Just A Friend.

Have you ever seen a card that you wanted to buy?
Killer performance, but a price sky-high?
Let me tell you a story of my situation;
I game on PCs, forget Playstation.
The tech that I like is really high-end.
But I gotta get by with a couple Benjamins.
I upgrade once a year, whenever I can.
Processors, hard drives, graphics cards, RAM.
i7 looked great; I bought i5.
Now it’s time for new graphics; make my games look live.
I know of Nvidia; I know ATI.
So many boards between ‘em, makes me want to cry.
G92’s been around, and that’s a fact.
Couldn’t find 740; that launch was whack.
But I’ve pulled my wallet out and I’m ready to buy.
I want something new; no shrunken die.
Read Chris’ Cypress story; that card looked hot.
If I had four bones, it’d already be bought.

Come onnnnnn, I can’t even afford that.
I’m looking for something under $200, man.

And here’s where ATI chimes in…

We’ve…we’ve got what you need. And you say you have $160 to spend?
And you say you have $160 to spend? Oh gamer…
We’ve…we’ve got what you need. And you say you have $160 to spend?
And you say you have $160 to spend? Oh gamer…
We’ve…we’ve got what you need. And you say you have $160 to spend?
And you say you have $160 to spend?

Last Year’s Flagship Is This Year’s Mid-Range

If the Radeon HD 5870 was characterized by roughly twice the computing resources as Radeon HD 4870, then the Radeon HD 5770 represents a halving of Radeon HD 5870. You’d think that’d yield something that looks a lot like the Radeon HD 4870 to which you’re already accustomed—and you’d be close to correct.

The Radeon HD 4870 is based on ATI’s 55nm RV770, sporting 956 million transistors on a 260 square millimeter die. It boasts 800 ALUs (shader processors), 40 texture units, a 256-bit memory interface armed with GDDR5 memory (cranking out 115.2 GB/s), and a depth/stencil rate of 64 pixels per clock.

In contrast, ATI’s 40nm Juniper GPU is made up of 1.04 billion transistors. It also wields 800 shader processors, 40 texture units, and a depth/stencil rate of 64 pixels per clock. But its memory interface, being a halved version of Cypress’, is only 128 bits wide. Nevertheless, ATI arms it with GDDR5 memory able to move up to 76.8 GB/s.
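Those bandwidth totals follow directly from data rate and bus width. Here's a quick sketch of the math; the per-pin data rates (3.6 Gb/s for the HD 4870, 4.8 Gb/s for the HD 5770) are assumptions inferred from the quoted totals, not figures stated in the text:

```python
# Peak GDDR5 bandwidth: effective data rate per pin (Gb/s) * bus width (bits) / 8
# gives GB/s. Per-pin rates below are assumed, back-calculated from the totals.
def gddr5_bandwidth_gbs(effective_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return effective_gbps * bus_width_bits / 8

# Radeon HD 4870: assumed 3.6 Gb/s per pin on a 256-bit bus
hd4870 = gddr5_bandwidth_gbs(3.6, 256)  # 115.2 GB/s
# Radeon HD 5770: assumed 4.8 Gb/s per pin on a 128-bit bus
hd5770 = gddr5_bandwidth_gbs(4.8, 128)  # 76.8 GB/s
print(hd4870, hd5770)
```

Note that even though Juniper's bus is half as wide, faster GDDR5 keeps its deficit to roughly a third, not a half, of the HD 4870's bandwidth.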

Right off the bat, we knew that this was going to be a very tough comparison—not only between ATI and Nvidia, but also between ATI and its own lineup of products. Yes, both of these new cards leverage DirectX 11 support. They both offer three digital display outputs split between DVI, HDMI, and DisplayPort connectors. And the pair is able to bitstream Dolby TrueHD and DTS-HD Master Audio from your home theater PC to your compatible receiver via HDMI 1.3, too.

But with specs that look roughly on par with the Radeon HD 4870 and Radeon HD 4770, anyone who recently purchased one of those previous-generation boards is bound to feel smug about the performance we see in this write-up—at least until DirectX 11 applications start emerging in greater numbers.

So, what’s the verdict? Is the Radeon HD 5770 worth $160 when Radeon HD 4870s sell for $145? Is the 1GB Radeon HD 5750 worth its $129 price tag in comparison to the $120 Radeon HD 4770 (with 512MB) or even Nvidia’s GeForce GTS 250 at a similar price? Let’s dig into the speeds, feeds, numbers, and multimedia tests for more.

Comment from the forums
  • onijutsu
    on the test setup page;
    "Corsair Dominator 4GB (3 x 2GB)"

    shouldn't it be 2 x 2GB?
  • onijutsu
    shouldn't these cards be able to overclock well, considering how energy efficient they are?
  • Anonymous
    Where's my 5830?
  • eskimo_1
The processor bottleneck test you presented is just horrible!
    For reasons readers will never know, you left AA out of some of the bottleneck tests but added AA in some of the previous benchmarks. For example, on page 8 the Far Cry 2 testing included a 4x AA run, but not in the CPU bottleneck test, which is what you SHOULD be testing for!
    AA/AF uses the CPU where the graphics card can't handle it, and for whatever reason you will come up with to answer me, I think you are simply trying to advertise the Core i5-750 CPU as some sort of incredible CPU that doesn't need overclocking. Are you being paid to do this on purpose or are you just really that shit at making a decent review?
  • Anonymous
    umm AA/AF is done purely on the graphics card.
  • Anonymous
    So you're saying that someone forking out small money for a mid range gfx card sees the value in supporting three monitors? Sorry but I really can't see that proposition!

    And the DX11 support to me seems a red herring too. Bit-Tech gave a more luke-warm reception and I have to say that I agree with them. DX11 is hardly a sell if the card can only just keep up with the upper-mid range in DX10? Its not like DX11 is going to increase FPS.....

    Bitstreaming? Don't know enough to comment!
  • Reynod
    Thanks Chris.

    Another good article.

    Although Intel won't want to hear your last sentence I am sure AMD wins either way.

    I think that was eskimo's point .. .wasn't it? Be happy with the Q3 earnings report.

We have long known that money spent on better graphics (or a second graphics card in SLI / CF), once you have a quad core running at around 3GHz (Intel i7 or AMD Phenom II), gives much better value return for gaming than anything else.
  • shrex
    wheres the overclocking?
  • shrex
owenand: "Its not like DX11 is going to increase FPS..... Bitstreaming? Don't know enough to comment!"

There's already BattleForge out there that renders using DX11 subroutines, etc., and it's showing an increase in fps over the DX10 version. So technically, it's a game that's taking advantage of a DX11 feature to get higher fps.

  • zsolmanz
    @ Anonymous 13/10/2009 16:37

    Perhaps some of the people "forking out small money for a mid range gfx card" are in the market purely for a 3-monitor setup that previously they'd have to use multiple cards for? I don't know.

    I would buy a 5770 for an HTPC (the idle power rating / core clock look pretty good to me) but not for anything else.
  • Sunderas
onijutsu: "shouldn't these cards be able to overclock well, considering how energy efficient they are?"

    Yup. They should.
  • Anonymous
    Worth noting ATI cards seem to have a lot of problems with decoding video streams - in particular there is an open problem with Win7 MCE and BBC HD over satellite.
  • welshmousepk
    @ anon

I'm pretty sure one of the big selling points of DX11 is how much more efficient it is, and how it can offer increased performance with a higher level of visual fidelity.

    So having a DX11 card will most likely allow for higher framerates than the same game run on a higher-spec DX10-only card.
  • luigi99
    I bought this card. My x1950 bit the dust in my last move, and I need a replacement under $200. My mobo only supports a single card, and this is probably the only upgrade I'll do for 2 years or more. At that point, my CPU will likely be a bottleneck (core 2 duo 2.1ghz, overclockable to 2.8 or so).

    My thoughts: Of single card setups under $200, this one seems like the best long term value. Especially overclocked, it's as powerful as anything else at this price point, and it has DX11 support. It killed me that I bought my x1950, only to have DX10 come out and instantly obsolete my card... the 5770 won't have that problem. I recognize that I'm giving up a few fps in the short term, but in the long term... this card will be competitive for a longer time in its field.
  • oscarebest
I have a 5750 and I have overclocked it to a 750 GPU clock and a 1250 memory clock. When the card is at 97% activity it gets up to 64C! Is that dangerous, and can it destroy my card? :O
  • w33dg0d
The cards are entirely overclockable, dependent on your current CPU/PSU set-up. I would advise buying an aftermarket cooling system too, as the card can get incredibly hot when you unleash its potential! I myself have a 5770 and agree that it is a powerful card; when combined with a good processor (e.g. i5 or higher) it excels in performance. And as for the price tag, at around £100 for the card (even lower for the 5750) it is a very wise investment for any gaming enthusiast that doesn't want to rape their wallet.