AMD Radeon R9 Fury X 4GB Review

Power Consumption

Introducing the New and Improved PowerTune

As you looked at all of those performance numbers, did you find yourself wondering whether AMD could significantly reduce the Radeon R9 Fury X's power consumption? We modified our measurement setup slightly in anticipation of that question. This allows us to analyze how well AMD’s updated PowerTune technology adjusts the card’s draw in response to load, temperature and other variables. We're looking for fast responses over short intervals, so our equipment samples every 100 microseconds in order to record every last fluctuation.
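To make the tables that follow easier to interpret, here is a minimal sketch, assuming a Python/NumPy environment, of how per-rail minimum, maximum and average figures could be derived from voltage and current logs captured at 100-microsecond intervals. The rail names mirror the tables below; the traces themselves are placeholder data, not our actual logs.

```python
# Hypothetical sketch: deriving per-rail min/max/average power figures, like the
# ones in the tables below, from voltage and current logs sampled every 100 us.
# All traces here are placeholder data, not our actual measurements.
import numpy as np

SAMPLE_INTERVAL_S = 100e-6   # one sample every 100 microseconds (10 kHz)
N_SAMPLES = 10_000           # one second of logging in this toy example

rails = {
    "PCIe Total":       (np.full(N_SAMPLES, 12.0), np.random.uniform(0.0, 1.3, N_SAMPLES)),
    "Motherboard 3.3V": (np.full(N_SAMPLES, 3.3),  np.random.uniform(0.0, 0.3, N_SAMPLES)),
    "Motherboard 12V":  (np.full(N_SAMPLES, 12.0), np.random.uniform(0.0, 0.65, N_SAMPLES)),
}

total = np.zeros(N_SAMPLES)
for name, (volts, amps) in rails.items():
    power = volts * amps     # instantaneous power per sample
    total += power
    print(f"{name:20s} min {power.min():7.2f} W  max {power.max():7.2f} W  avg {power.mean():7.2f} W")

print(f"{'Graphics Card Total':20s} min {total.min():7.2f} W  max {total.max():7.2f} W  avg {total.mean():7.2f} W")
```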

It turns out that PowerTune takes big strides toward improved resolution and gradation! We ran up against the limits of our logging technology, so the curves no longer look quite as clean even after high-cut filtering, but the bottom line is undeniable: PowerTune can now react to parameter changes in intervals of 10 microseconds or less. The following chart shows what happens over a period of just 100 microseconds.
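For readers curious about the high-cut filtering mentioned above, this is a minimal sketch of how a logged power trace could be smoothed before plotting. It assumes SciPy is available; the cutoff frequency and the synthetic trace are illustrative placeholders, not our actual measurement parameters.

```python
# Minimal sketch of high-cut (low-pass) filtering applied to a logged power
# trace before plotting. The trace and cutoff frequency are placeholders.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1.0 / 100e-6           # 10 kHz sample rate from 100-microsecond sampling
cutoff_hz = 1_000.0         # hypothetical high-cut frequency for cleaner curves

t = np.arange(0, 0.1, 1 / fs)                  # 100 ms of samples
trace = 220 + 30 * np.random.randn(t.size)     # noisy ~220 W load (synthetic)

b, a = butter(4, cutoff_hz / (fs / 2))         # 4th-order Butterworth low-pass
smoothed = filtfilt(b, a, trace)               # zero-phase filtering keeps peaks aligned
```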

AMD’s engineers deserve some praise; these results look like they come from a card based on Nvidia's Maxwell architecture. Next, we want to see how these observations are reflected in the individual load scenarios. In theory, idle power consumption should be markedly lower, whereas the stress test might, unfortunately, trigger a massive increase. That is, unless AMD set a conservative limit just like Nvidia did, since under continuous full load nothing can be predicted, and therefore nothing can be saved. We answer all of these questions in detail below.

Note that the peaks on the individual rails don't necessarily occur at the same time, which is why the total peak power consumption in these tables isn't simply the sum of the per-rail peaks.
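The following toy example, using made-up traces rather than our measurements, illustrates the point: because the rails peak at different sample indices, the maximum of the summed trace typically falls short of the sum of the individual maxima.

```python
# Toy illustration: per-rail maxima usually occur at different sample indices,
# so the peak of the summed trace is lower than the sum of the individual peaks.
# The three traces below are made-up placeholders, not measurement data.
import numpy as np

rng = np.random.default_rng(0)
pcie_aux = 200 + 40 * rng.random(1_000)    # auxiliary PCIe power connectors, W
mobo_12v = 10 + 20 * rng.random(1_000)     # motherboard 12V slot power, W
mobo_3v3 = 0.3 + 0.7 * rng.random(1_000)   # motherboard 3.3V slot power, W

total = pcie_aux + mobo_12v + mobo_3v3

sum_of_peaks = pcie_aux.max() + mobo_12v.max() + mobo_3v3.max()
peak_of_total = total.max()

print(f"sum of per-rail peaks: {sum_of_peaks:.2f} W")
print(f"peak of the total:     {peak_of_total:.2f} W")  # almost always lower
```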

Idle Power Consumption

AMD’s Radeon R9 Fury X meets our lofty expectations. On the one hand, HBM consumes far less power than GDDR5. On the other, there's a pump and a larger fan to take into account, and together they shouldn't draw much less than 2W on their own. In that light, the 4.8W we measure is a fantastic result, and it holds up even when we connect more than one monitor. AMD has finally taken care of that long-standing multi-monitor anachronism, and it was overdue. The end result also comes in well under Nvidia's reference GeForce GTX 980 Ti.

                      Minimum    Maximum    Average
PCIe Total            0.00W      15.60W     2.88W
Motherboard 3.3V      0.00W      0.99W      0.26W
Motherboard 12V       0.00W      7.80W      1.65W
Graphics Card Total   0.00W      21.96W     4.80W

We also see that the motherboard's 3.3V rail isn't really a factor anymore. The other rails fluctuate between zero and the occasional peak, but the averages look great, and this is reflected in each rail's measurement curve.

Overview of all tested graphics cards

Gaming Power Consumption

This is the result most of you will be interested in after checking out the gaming benchmarks, and again, the news looks good. AMD’s Radeon R9 Fury X averages just under 221W after a warm-up period (216W when cold), which comes in below what we observed from Nvidia's reference GeForce GTX 980 Ti. The 12W difference might not sound like a lot, but it's enough to send a clear message.

A Fiji GPU with some of its resources disabled or dialed to a lower clock rate would likely be more efficient, too. That makes the hotly anticipated Radeon R9 Fury Nano a much more plausible concept.

                      Minimum    Maximum    Average
PCIe Total            25.20W     428.80W    209.81W
Motherboard 3.3V      0.00W      1.32W      0.33W
Motherboard 12V       0.00W      31.20W     10.55W
Graphics Card Total   25.53W     453.58W    220.69W

Once again, the 3.3V rail sees almost no use, and the motherboard's 12V rail isn't asked for much either. This is particularly notable, since it means that almost all of the heavy lifting is done by the auxiliary PCIe power connectors. As a pleasant side effect, potential high-frequency load fluctuations above 100 kHz don't travel through the motherboard's supply lines, where they could reach and modulate the audio components.
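As a quick sanity check, the averages from the gaming table above already show how little of the load actually flows through the motherboard slot; the short calculation below simply restates those published figures.

```python
# Average figures taken from the gaming power-consumption table above.
aux_connectors = 209.81   # "PCIe Total" (auxiliary power connectors), W
slot_3v3 = 0.33           # motherboard 3.3V rail, W
slot_12v = 10.55          # motherboard 12V rail, W

total = aux_connectors + slot_3v3 + slot_12v
slot_share = (slot_3v3 + slot_12v) / total

print(f"total average draw: {total:.2f} W")     # ~220.69 W, matching the table
print(f"slot share of load: {slot_share:.1%}")  # roughly 5 percent
```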

The individual rails are interesting to look at, since the load fluctuations can be identified quite easily:

Overview of all tested graphics cards

Stress Test Power Consumption

If AMD hadn't imposed a strict limit, power consumption might have veered out of control under full load. And sure enough, the Radeon R9 Fury X can get warm and cozy just under 348W. Granted, this only happens under what the company would consider a power virus, or some OpenCL-based task capable of hammering Fiji's shaders. Compute-heavy workloads also tend to run for longer than typical synthetic benchmarks.

Given the liquid cooler, AMD grants its Radeon R9 Fury X a much more generous power ceiling than Nvidia permits for the GeForce GTX 980 Ti. Whether that extra ~100W of draw is good or bad is up for debate. Regardless, it's unlikely that you'll ever see this limit under normal use.

                      Minimum    Maximum    Average
PCIe Total            30.24W     421.20W    327.80W
Motherboard 3.3V      0.00W      1.65W      0.68W
Motherboard 12V       2.60W      31.20W     19.02W
Graphics Card Total   40.32W     448.52W    347.49W

Here are the separate curves for the individual rails one last time:

Overview of all tested graphics cards

AMD does a great job managing power, and we're happy to see the company competing on a more even footing with Nvidia in this area as well. The Radeon R9 Fury X even manages to beat Nvidia’s reference GeForce GTX 980 Ti due to the Fury's better cooling and the resulting lower leakage currents.

Comments from the forums
  • cdrkf
    I just wanted to say a big Thank You to Toms for a very fair appraisal of the new card. It's very surprising that AMD have managed to get the power consumption down so much as well.

    It's also interesting how nVidia and AMD have both arrived at roughly equivalent sized GPUs with roughly equivalent performance through 2 very different approaches on the same process node. I think this is essentially the last 'hurrah' for 28nm... roll on 14nm (or 16nm for TSMC) ff for the next gen!
  • BigBadBeef
    The R9 295X2... just... will... not... yield!

    Savage, SAVAGE creature that just keeps fighting for first place like nobody's business!
  • logainofhades
    Looking forward to what Nano offers.
  • kyzarvs
    But... can it play Batman?

    Oh wait, nothing can...
  • wtfxxxgp
    Love this from AMD! I'm not a brand-whore so I'm absolutely loving what this will do to pricing of these "ultra high-end" GPU's. It's about time that AMD releases something they can truly be very proud of because it's been too long that NVidia has been the head of the table w.r.t. efficiency.
  • complete_minger
    I don't want to bash Amd and I hope they can pull their finger out, but its aimed at 4k gaming and there is no HDMI 2.0 support? Whyyyyy?

    I may be wrong but I understand that it costs the same as a 980 ti. Sadly, I would have to go with nvidia, which is a shame but I was hoping for just that bit more from the 390x
  • cdrkf
    1997167 said:
    I don't want to bash Amd and I hope they can pull their finger out, but its aimed at 4k gaming and there is no HDMI 2.0 support? Whyyyyy? I may be wrong but I understand that it costs the same as a 980 ti. Sadly, I would have to go with nvidia, which is a shame but I was hoping for just that bit more from the 390x


    Lets put this in perspective, HDMI 2.0 is only needed for *4K TV*. The display port connections on Fury support full 4k at 60 fps and display port is standard on 4k monitors. Also to those complaining about no DVI-DL connector, you can convert DP to DVI-DL really easily with a single connector which costs next to nothing, and recent photos of an 'Un-boxing of Fury' showed one of these connectors supplied FOC in the box so I don't see it as a problem.

    I agree it's a shame they missed HDMI 2, as the upcoming low power 'nano' version would have been ideal to put in a Steam Box to run a TV. It's still not as big a problem as it's being made out to be though imo.
  • complete_minger
    1282978 said:
    1997167 said:
    I don't want to bash Amd and I hope they can pull their finger out, but its aimed at 4k gaming and there is no HDMI 2.0 support? Whyyyyy? I may be wrong but I understand that it costs the same as a 980 ti. Sadly, I would have to go with nvidia, which is a shame but I was hoping for just that bit more from the 390x
    Lets put this in perspective, HDMI 2.0 is only needed for *4K TV*. The display port connections on Fury support full 4k at 60 fps and display port is standard on 4k monitors. Also to those complaining about no DVI-DL connector, you can convert DP to DVI-DL really easily with a single connector which costs next to nothing, and recent photos of an 'Un-boxing of Fury' showed one of these connectors supplied FOC in the box so I don't see it as a problem. I agree it's a shame they missed HDMI 2, as the upcoming low power 'nano' version would have been ideal to put in a Steam Box to run a TV. It's still not as big a problem as it's being made out to be though imo.


    my bad, never realised it supported 4k at 60hz over DP
  • Tuomas Viitala
    Quote:
    The R9 295X2... just... will... not... yield! Savage, SAVAGE creature that just keeps fighting for first place like nobody's business!


    The 295x2 is a dual GPU, it's kind of the same as the 290x... so if there'd be 2 Titans they would rip a 295x2 into pieces
  • BigBadBeef
    I don't see how this is relevant to the matter at any single way in which you approach it? Its one device... ONE! One, not two, a single piece, something you stick in and run, no strings attached, no crossfire cables needed.

    Now if the notion that your favourite GPU manufacturer didn't think of that as well offends you, that's too bad, take it up with nVidia!
  • cdrkf
    Nvidia did think of a dual gpu card. It's called the titan Z. It came out around same time as 295x2, cost twice the price and performed worse than the amd card, which is why it doesn't get mentioned lol. To be fair it's unusual for nvidia to get things really wrong but it does happen :P

    Also even if nvidia release a dual titan x, Fury X2 is already confirmed and on the way (that is going to be one powerful card!).
  • Tuomas Viitala
    Yes, but some games that don't support SLI / CF don't run well with a 295x2 ;) and nvidia has thought of a titan x2 which will be the ultimate card for a couple of years again :p
  • BigBadBeef
    ***ENOUGH!***

    The specs of both manufacturers are out there, they are objective and are NOT open to interpretation! I am telling you to stop this voodoo mumbo speculative dick measuring competition, it's unbecoming to this entire forum!

    AMD dick is the biggest... oh nonono, nvidia has the bigger dick... oh please...