AMD Radeon HD 6670 And 6570: Turkeys Or Turkish Delights?

Based on the new Turks GPU, AMD’s Radeon HD 6570 and 6670 graphics cards are poised to hit the sub-£80 market. Do these products have what it takes to compete in this fiercely competitive segment, or are AMD's subtle evolutionary changes too small?

For the third time this month, we’re reporting a graphics card launch from AMD. And this time it’s a double-header. Meet the Radeon HD 6570 and 6670:

These new products are a natural evolution of the Radeon HD 5570 and 5670, two very important cards in the sub-£80 graphics card market. For some time, we’ve recognized the Radeon HD 5570 as a realistic ~£55 starting point for budget buyers looking for respectable gaming hardware, and the ~£65 Radeon HD 5670 holds the distinction of being the most powerful reference design that doesn’t require a dedicated PCIe power cable.

To this point, most Radeon HD 6000-series cards have employed subtle (but notable) architectural tuning of their 5000-series predecessors, so we expect these new models to be closely related to the cards they replace. Let’s see how the Turks GPU stacks up:

Turks is another offspring of the Barts graphics processor introduced with the Radeon HD 6800 series, which was itself an evolution of Cypress. This one is scaled down to six SIMD engines. Each engine is associated with four texture units and is composed of 16 thread processors, with five stream processing units (ALUs) per thread processor. In the case of Turks, that makes for a grand total of 24 texture units and 480 ALUs. Two 64-bit memory controllers deliver an aggregate 128-bit memory interface, and each of the two render back-ends hosts four colour ROPs, for a total of eight.
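The headline totals follow directly from those per-engine figures, and the bandwidth numbers quoted for these cards fall out of the bus width and memory clock. A quick sanity-check sketch (plain arithmetic; the constant and function names are ours, not AMD's):

```python
# Turks totals from the per-SIMD-engine figures quoted above.
SIMD_ENGINES = 6
THREAD_PROCESSORS_PER_SIMD = 16
ALUS_PER_THREAD_PROCESSOR = 5
TEXTURE_UNITS_PER_SIMD = 4

alus = SIMD_ENGINES * THREAD_PROCESSORS_PER_SIMD * ALUS_PER_THREAD_PROCESSOR
texture_units = SIMD_ENGINES * TEXTURE_UNITS_PER_SIMD
print(alus, texture_units)  # 480 24

def bandwidth_gbs(mem_clock_mhz, bus_width_bits, transfers_per_clock):
    """Peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return mem_clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

# GDDR5 moves four transfers per memory clock; DDR3 moves two.
print(bandwidth_gbs(1000, 128, 4))  # 64.0 GB/s -- GDDR5 at 1000 MHz
print(bandwidth_gbs(900, 128, 2))   # 28.8 GB/s -- DDR3 at 900 MHz
```

The 64 GB/s and 28.8 GB/s results match the figures AMD quotes for the GDDR5 and DDR3 variants, respectively.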

Turks sports the same features found across the entire Radeon 6000 line: improved tessellation performance, Eyefinity enhancements (these particular models support up to four displays), and Blu-ray 3D decode acceleration. At least for now, you'll find this GPU in two specific configurations: Radeon HD 6570 and 6670. Let’s consider where they sit in the grand scheme of things:


|                                  | Radeon HD 5570 | Radeon HD 6570 | Radeon HD 6670 | Radeon HD 5670 | GeForce GT 240 | GeForce GTS 450 |
|----------------------------------|----------------|----------------|----------------|----------------|----------------|-----------------|
| Shader Cores                     | 400 | 480 | 480 | 400 | 96 | 192 |
| Texture Units                    | 20 | 24 | 24 | 20 | 32 | 32 |
| Colour ROPs                      | 8 | 8 | 8 | 8 | 8 | 16 |
| Fabrication Process              | 40 nm | 40 nm | 40 nm | 40 nm | 40 nm | 40 nm |
| Core/Shader Clock                | 650 MHz | 650 MHz | 800 MHz | 775 MHz | 550/1360 MHz | 783/1566 MHz |
| Memory Clock                     | 900 MHz DDR3 / 900-1000 MHz GDDR5 | 900 MHz DDR3 / 900-1000 MHz GDDR5 | 1000 MHz GDDR5 | 1000 MHz GDDR5 | 850 MHz GDDR5 | 902 MHz GDDR5 |
| Memory Bus                       | 128-bit | 128-bit | 128-bit | 128-bit | 128-bit | 128-bit |
| Memory Bandwidth                 | 28.8 GB/s DDR3 / 64 GB/s GDDR5 | 28.8 GB/s DDR3 / 64 GB/s GDDR5 | 64 GB/s | 64 GB/s | 54.4 GB/s | 57.7 GB/s |
| Thermal Design Power Idle/Max (W)| 9.7/43 W | 10/44 W | 11/60 W | 14/61 W | 69 W max. | 106 W max. |
| Price                            | ~£55 online | £55 (MSRP) | £75 (MSRP) | ~£65 online | ~£65 online | ~£100 online |


From this chart, it's particularly clear that the Radeon HD 6570 and 6670 are simple evolutionary steps from the Radeon HD 5570 and 5670. While the per-SIMD-engine specifications are unchanged, the new models sport six engines instead of five, resulting in 80 more ALUs and four more texture units. The Radeon HD 5570 and 6570 share the same clock rates, while the Radeon HD 6670 gets a 25 MHz core boost over the 5670. As a result, we expect the Turks-equipped cards to demonstrate a lead over their predecessors when it comes to performance, even if the advantage isn't particularly pronounced, given the identical ROP count and 128-bit memory interface.
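Putting rough numbers on that expectation: peak shader throughput scales with ALU count times clock, while memory bandwidth stays put at 64 GB/s. A back-of-the-envelope comparison (our own arithmetic, not vendor figures):

```python
# Peak single-precision throughput: each VLIW5 ALU can issue one
# fused multiply-add (2 FLOPs) per clock.
def peak_gflops(alus, core_clock_mhz):
    return alus * 2 * core_clock_mhz / 1000.0

hd5570 = peak_gflops(400, 650)   # 520 GFLOPS
hd6570 = peak_gflops(480, 650)   # 624 GFLOPS
hd5670 = peak_gflops(400, 775)   # 620 GFLOPS
hd6670 = peak_gflops(480, 800)   # 768 GFLOPS

print(f"6570 vs 5570: +{hd6570 / hd5570 - 1:.0%}")  # +20%
print(f"6670 vs 5670: +{hd6670 / hd5670 - 1:.0%}")  # +24%
```

A 20-24% gain in peak shader throughput on unchanged memory bandwidth and ROP resources is consistent with a real, but modest, real-world lead.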

While we don’t expect much from the similarly-priced GeForce GT 240 (a card with lower performance than the Radeon HD 5670), the GeForce GTS 450 can be found kissing the £85-£90 mark, promising stiff competition for the new Radeon HD 6670 and its £75 MSRP. With double the CUDA cores and ROPs of the GT 240, Nvidia's GeForce GTS 450 is a serious player. But before we see how these entry-level cards fare in combat, let’s have a look at AMD's Radeon HD 6570 and 6670 reference hardware.

Comments from the forums
  • Zingam
    At a time where most monitors are 1920x1080 putting on top any benchmarks below that resolution is pointless.

    So 30 fps at that resolution in Crysis 2 - not too bad but for me these cards are worthless for any gaming at all. They do not deserve even to be benchmarked and AMD isn't that great in general 2D applications as well.

    I think it is time when GPU designers should think of how to improve and support 1920x1080 as a minimum resolutions. And not just GPU designers but also game developers and OS developers.

    Yeah, some notebooks have lower resolutions but these parts are not notebook parts anyway.

    In the age of CRTs such benchmarks made sense but currently using non-native resolutions on a LCD is ... you know what it is!
  • Solitaire
    Assuming that every panel ever made has a native res of 1080p... you know what it is! :p

    For cheapo 780p+ (1366*768) and 19" panels (1600*900) these cards are very strong budget performers that require sweet f-all juice to run. And even for 1080p HTPC use they're just the ticket for playing older or less demanding games. So please get off the "Y U NO HD7990??!?" high horse...
  • FC360
    Why is the 5770 listed under test setup but isn't listed anywhere else?