
Power Consumption

GeForce GTX 690 Review: Testing Nvidia's Sexiest Graphics Card

We already know the GeForce GTX 690 sports two GK104s and is priced at around £850. But hardware like this is fun to read about. Oh, you actually want to buy one? Expect performance just shy of two GeForce GTX 680s in SLI, and good luck tracking one down.

That the latest-generation cards are at the top of this chart speaks volumes about the technologies used by both AMD and Nvidia to reduce the power use of even their highest-end hardware.

Although Nvidia’s GeForce GTX 690 doesn’t fare particularly well overall, it is the second-most power-friendly solution amongst the dual-GPU configurations, just behind two GeForce GTX 680s in SLI.

AMD really struts its stuff when your displays go to sleep. Its ZeroCore technology allows two Radeon HD 7970s to use as much power as one GeForce GTX 680 at idle. And whereas the dual-GPU AMD setup sheds 20 W right off the bat, GeForce GTX 690 is only able to drop 3 W in the same situation.

Each GeForce GTX 680 is rated with a 195 W maximum board power. Each Radeon HD 7970 is rated for 250 W. The GeForce GTX 690’s board power is set at 300 W.
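Those rated board powers alone explain the ordering measured under load; a quick back-of-the-envelope comparison (using only the figures cited above):

```python
# Rated maximum board power in watts, as quoted in the review.
GTX_680_SLI = 2 * 195   # two GeForce GTX 680 cards in SLI
HD_7970_CF  = 2 * 250   # two Radeon HD 7970 cards in CrossFire
GTX_690     = 300       # single dual-GPU GeForce GTX 690

print(f"GTX 680 SLI: {GTX_680_SLI} W")  # 390 W
print(f"HD 7970 CF:  {HD_7970_CF} W")   # 500 W
print(f"GTX 690:     {GTX_690} W")      # 300 W
```

On paper, then, the single GTX 690 undercuts its own SLI sibling by 90 W and the CrossFire pair by a full 200 W, which matches the ordering we see in the load measurements.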

It makes sense, then, that it’d use less power under load compared to twin GeForce GTX 680s. Two Radeons in CrossFire are noticeably more egregious power consumers. Even the GeForce GTX 590 averages higher use than the new GeForce GTX 690.

Truly, this is where the Kepler architecture’s emphasis on performance/watt shines. It would have been nice to see Nvidia spend more time cutting power at idle, like AMD, given the majority of time we spend not gaming. However, the savings under load are certainly impressive.

  • 2
    tracker45 , 3 May 2012 22:09
    poor price/performance ratio though.
  • 3
    tracker45 , 3 May 2012 22:17
    the 7950 wins this battle
  • 0
    dizzy_davidh , 4 May 2012 13:45
    As I've posted here a lot of times, I don't know how you perform your testing but I always find that most of the scores you report for nVidia models are down on my own stock-system tests by quite a bit (btw, I have access to play with and test most new devices except for the brand-spanking new).

    In the pocket of AMD or just bad testing?!?

    One example is your GTX 590 results across its capable resolutions versus the AMDs, where I find your numbers fall some 15-20 fps short!
  • 2
    SSri , 4 May 2012 16:41
    I would have loved the usual computing, number crunching and video encoding as a part of the benchmark...
  • 1
    Rattengesicht , 6 May 2012 18:37
    Quote:
    I would have loved the usual computing, number crunching and video encoding as a part of the benchmark...


    Waste of time. Nvidia cripples their GeForce cards massively when it comes to GPGPU stuff. Those are straight gaming cards, absolutely useless for everything else.
    Just look at their incredibly poor performance when it comes to FP64 or even just OpenCL workloads.
    Makes me sad having to switch to AMD just because I can't afford to spend a few thousand dollars on a Quadro for semi-pro work.
  • 1
    damian86 , 9 May 2012 08:06
    I don't really see the point in having a third DVI port instead of an HDMI port.
    Plus, how about running SLI through an HDMI interface instead? Would that make a difference?
  • 0
    asal1980 , 16 May 2012 15:37
    Lamborghini’s Reventón. They’re all million-dollar-plus automobiles that most of us love to read about, probably won’t see on the road, and almost certainly will never own.
  • 0
    s1ddy , 25 August 2012 00:43
    Running 2 690s in SLI and am VERY happy with them. All my games run smoothly at 130-150 fps with all settings maxed out; the only snag is setting up Surround... it's a pain to find which ports need to be used for a tri-monitor setup :p. The hot air blowing into my case was simply fixed by proper airflow (turning the front fan from intake to exhaust sorted it; I'm using an NZXT Phantom, so the one big and two small side fans compensate perfectly). Maybe if ATI answers nVidia in a good way I'll switch to ATI, but for now I'm sticking with nVidia's Keplers ;)