
Display Outputs And AMD's Tessellation Coup

AMD Radeon HD 6990 4 GB Review: Antilles Makes (Too Much) Noise

Eye See You

Given the Radeon HD 6990’s brute force approach to performance and cooling, we’re happy to see that elegance didn’t go completely ignored. Using a single slot’s worth of I/O bracket space, AMD exposes an unprecedented five display outputs: one dual-link DVI and four mini-DisplayPort connectors. The retail Radeon HD 6990 will ship with a trio of adapters for more diversity: one passive mini-DP-to-single-link DVI, one active mini-DP-to-single-link DVI, and one passive mini-DP-to-HDMI. AMD calls these a roughly $60 value.

I’m a big proponent of multi-display configurations for enhancing productivity, and I currently use a 3x1 landscape configuration. I consider five screens overkill for what I do. But AMD is now pushing a native 5x1 portrait mode that admittedly looks pretty interesting.

Beyond simply working more efficiently, using three or five screens is also a great way to take advantage of graphics horsepower available from a 375+ W dual-GPU card. As you’ll see in the benchmarks, 1680x1050 and 1920x1080 are often wasted on such a potent piece of hardware—even with anti-aliasing and anisotropic filtering enabled.

Just remember, with more than one screen attached to a card like AMD’s Radeon HD 6990, idle power consumption won’t match the figures we present toward the end of this piece. It actually jumps fairly substantially due to the need for higher clocks. With just one screen attached to the 6990’s dual-link DVI output, we observe 148.5 W system power consumption at idle. With a trio attached to the mini-DisplayPort connectors, that figure jumps to 187.2 W.
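For reference, the multi-monitor idle penalty from the measurements above works out to:

```python
# Idle system power from this review's measurements: one display on the
# dual-link DVI output versus three displays on the mini-DisplayPort outputs.
single_display_w = 148.5
triple_display_w = 187.2

print(f"Multi-monitor idle penalty: {triple_display_w - single_display_w:.1f} W")
# Multi-monitor idle penalty: 38.7 W
```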

To be clear, you'll see higher power use from Nvidia cards as well in multi-monitor environments. AMD is the first to explain why this power increase is necessary, though:

"PowerPlay saves power via engine voltage, engine clock, and memory clock switching. Memory clock switching is timed to be done within an LCD VBLANK so that a flash isn't seen on the screen when the memory speed is changed. This can be done on a single display, but not with multiple displays because they can (and in 99% of the cases, will be) running different timings and virtually impossible to hit a VBLANK on both at the same time on all the panels connected (and when we say "timings" it’s not as simple as just the refresh rate of the panel, but the exact timings that the panel's receivers are running). So, to keep away from the end user seeing flashing all the time, the MCLK is kept at the high MCLK rate at all times.

With regard to power savings under multiple monitors, we have to trade off between usability and power. Because we can't control what combinations of panels are connected to a desktop system, we have to choose usability. Other power-saving features are still active (such as clock gating, etc.), so you are still saving more power than at peak activity. Note that in a DisplayPort environment we have more control over the timing, and hence this issue could go away if all of the panels connected were DP."
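The policy AMD describes can be sketched in a few lines. This is purely an illustrative model, not AMD driver code (the `DisplayTiming` type and `can_downclock_memory` function are our own inventions): the memory clock may only drop when a single VBLANK window covers every attached panel, which in practice requires identical timings.

```python
# Sketch of the policy AMD describes: MCLK can only be switched inside a
# VBLANK interval without visible flashing. With one display that's easy;
# with several displays running different timings, their VBLANKs almost
# never align, so the driver pins the memory clock high.
from dataclasses import dataclass

@dataclass(frozen=True)
class DisplayTiming:
    """Exact receiver timings, not just the nominal refresh rate."""
    pixel_clock_khz: int
    h_total: int
    v_total: int

def can_downclock_memory(displays: list[DisplayTiming]) -> bool:
    """Downclocking is safe only if every panel shares identical timings,
    so one VBLANK window covers them all (e.g. a synchronized DP setup)."""
    if len(displays) <= 1:
        return True
    return all(d == displays[0] for d in displays)

single = [DisplayTiming(148_500, 2200, 1125)]           # standard 1080p60
mixed = single + [DisplayTiming(146_250, 2250, 1200)]   # different timings

print(can_downclock_memory(single))  # True  -> idle MCLK allowed
print(can_downclock_memory(mixed))   # False -> MCLK stays at the high rate
```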

PolyMorph What?

When Nvidia launched its GF100-based cards (GeForce GTX 480 And 470: From Fermi And GF100 To Actual Cards!), it pushed geometry as the next logical step in enhancing the realism of our games. We saw many compelling tech demos and game engine demonstrations that backed up the company's party line. But I wasn't prepared to give Nvidia a pat on the back until an actual game started shipping with more than a superficial implementation of tessellation, one that genuinely enhances realism. HAWX 2 was the first example of this. I immediately started using HAWX 2 for all of our tessellation performance measurements in graphics card reviews, and I came away with some interesting conclusions.

First, the PolyMorph engines resident in each of Nvidia's Streaming Multiprocessors didn't seem to scale very well. A GeForce GTX 560 Ti features eight SMs, and consequently eight PolyMorph geometry engines. In comparison, a GeForce GTX 570 employs 15 SMs. Yet we've already seen that the 570 retains 71% of its performance in HAWX 2 after turning tessellation on, while the 560 Ti serves up 70% of its original performance with the feature enabled. That one-point difference suggests that adding more PolyMorph engines only mitigates tessellation's performance impact up to a certain point.
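Expressed numerically (retention figures from our HAWX 2 results above; absolute frame rates omitted), the scaling problem looks like this:

```python
# Performance retained with tessellation enabled in HAWX 2, versus the
# number of PolyMorph engines per card (figures from this review).
retention = {
    # card: (PolyMorph engines, fraction of performance retained)
    "GeForce GTX 560 Ti": (8, 0.70),
    "GeForce GTX 570": (15, 0.71),
}

engines_560, kept_560 = retention["GeForce GTX 560 Ti"]
engines_570, kept_570 = retention["GeForce GTX 570"]

# Nearly doubling the engine count (8 -> 15) buys a single point:
print(f"{(kept_570 - kept_560) * 100:.0f} point(s)")  # 1 point(s)
```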

But at least Nvidia could still point out that AMD's cards shed nearly 40% of their performance with tessellation enabled. Well, it'd seem that a pair of Cayman GPUs cumulatively able to crank out four primitives per clock turns that story on its head. Radeon HD 6990 doesn't impress us with its frankly modest lead over the GeForce GTX 580; it impresses us by retaining 76% of its original frame rate with tessellation turned on. That's better than GeForce GTX 580's 75%. Never mind those 16 PolyMorph engines. It looks like four of AMD's tessellation units do the trick here.

Comments

This thread is closed for comments.
  • doive1231, 8 March 2011 15:31
    What about 10 display outputs from a CrossFire setup? Yeah!
  • prightiam, 8 March 2011 18:59
    So this thing is actually louder than the dustbuster?!
  • Anonymous, 8 March 2011 19:48
    I'm a bit puzzled by the lack of a GTX 580 SLI comparison. Once again an incomplete review. Sorry!
  • Sabiancym, 8 March 2011 20:28
    A lot of the people who would get these, including me, would water cool them. So the sound issue is gone.

    These things are beasts.
  • blubbey, 8 March 2011 20:55
    Crysis has finally been conquered. It only took a few years and an absurd amount of power, however =]
  • fruees, 9 March 2011 01:23
    This is why I'd never buy a super-high end card, the ppl that bought the 5990s dropped about £450 4 months ago and now their card isn't even mentioned in the high performance benchmarks!
  • evilgenius134, 9 March 2011 02:09
    Quoting fruees: "This is why I'd never buy a super-high end card, the ppl that bought the 5990s dropped about £450 4 months ago and now their card isn't even mentioned in the high performance benchmarks!"

    The previous card to this is the 5970, and it is mentioned.
  • damian86, 9 March 2011 06:27
    Yes, but still, you can't compare this to Nvidia's balance between image quality and speed. Radeon's balance is unfortunately awful, and I cannot see the driver issues getting better; it still needs a touch of 'gamer' in the software. A few years ago a comparison was done between the two, and Radeons had a high percentage of BSODs. I can still hear people complaining about the drivers...
  • asteldian, 9 March 2011 22:00
    I hate two-cards-in-one setups. This just strengthens my distaste for them. I considered the GTX 480 a monster that should never have hit the shelves; now this beast has turned up. At least it can try to justify itself by being a dual-GPU card.
    AMD tend to be a bit sloppy with driver support for games at the best of times. I can only imagine the nightmare these will be.
  • Anonymous, 10 March 2011 00:01
    Quoting asteldian: "AMD tend to be a bit sloppy with driver support for games at the best of times"

    People say that, but I've never *ever* had a driver issue with ATI/AMD cards in the 10+ years I've been using them.
  • Griffolion, 10 March 2011 22:32
    Impressive stuff, but let's wait to see what Nvidia comes out with in the GTX 590. Considering that AMD has broken through the 300 W ceiling quite considerably, I'd like to see what Nvidia can pull off with the same power budget the 6990 had.

    I have a 5970, and until games start to slow down that plus my i5 at 4 GHz, I won't be buying anything new.
  • Solitaire, 11 March 2011 01:25
    Not just this card but everything based on the same PCB is little better than an engineering sample; it's too impractical, and I expect most board partners will be cooking up slightly shorter cards and vastly superior cooling solutions. Anyone buying a stock card is either mad or preparing to rip that horrid fan clean off and stick a hugeass waterblock on that card...