GeForce GTX 690 Review: Testing Nvidia's Sexiest Graphics Card

Overclocking And Tessellation Performance

Overclocking GeForce GTX 690

Beyond simply building a more capable cooler, Nvidia claims that hand-picking low-leakage GK104 GPUs helps minimize the GeForce GTX 690’s thermal output. As a result, the card slides in under a 300 W TDP. But company representatives say there is plenty of headroom left in the card for clock rates beyond the stock 915 MHz base and 1019 MHz typical GPU Boost frequency.

Using EVGA’s excellent Precision X tool, we managed to apply a 150 MHz core offset and a 25 MHz memory offset with a 20%-higher power target. Stability was marginal at those settings, though, so we nudged the voltage from its 0.988 V default up to 1.025 V, which kept the card from crashing.
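For context, here is a minimal back-of-the-envelope sketch of what those offsets work out to. The 1502 MHz stock memory clock and the assumption that the offset applies to that command clock are ours, not something the tool reports, and real GPU Boost frequencies float with power and thermal headroom rather than sitting at a fixed number.

# Simple arithmetic sketch of the overclock described above (not Nvidia's
# GPU Boost algorithm). Stock core clocks come from the article; the stock
# memory clock is an ASSUMPTION based on the GTX 690's 6 GT/s GDDR5 rating.
stock_base_mhz = 915      # GTX 690 stock base clock
stock_boost_mhz = 1019    # typical GPU Boost clock
core_offset_mhz = 150     # core offset dialed in with EVGA Precision X
mem_offset_mhz = 25       # memory offset dialed in with EVGA Precision X
stock_mem_mhz = 1502      # assumed GDDR5 command clock (6 GT/s effective)

print(f"Overclocked base clock:  {stock_base_mhz + core_offset_mhz} MHz")
print(f"Overclocked boost clock: {stock_boost_mhz + core_offset_mhz} MHz")
# Assumes the offset applies to the command clock; GDDR5 moves four bits
# per clock, so the effective data rate is roughly 4x that figure.
print(f"Overclocked memory:      {stock_mem_mhz + mem_offset_mhz} MHz "
      f"(~{(stock_mem_mhz + mem_offset_mhz) * 4} MT/s effective)")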

The resulting gains aren’t bad, ranging from a 13%+ speed-up in Battlefield 3 to a 5%+ boost in Metro 2033 at 2560x1600.

Tessellation Performance

You can call it tradition by this point. Our examination of tessellation scaling is intended to quantify claims that both Nvidia and AMD make regarding continually improving implementations of geometry processing. We like to use real-world metrics where possible, and HAWX 2 gives us an easy on/off toggle for applying additional vertices.

The only real take-away here is that a GeForce GTX 690 does as well as two GeForce GTX 680s in SLI, which in turn improve on what a single GeForce GTX 680 achieves on its own. We’re not sure why the 680 bleeds off so much of its performance when tessellation is enabled, but clearly some bottleneck other than geometry is hammering the frame rate.

Comments from the forums
  • tracker45
    poor price/performance ratio though.
  • tracker45
    the 7950 wins this battle
  • dizzy_davidh
    As I've posted here a lot of times, I don't know how you perform your testing but I always find that most of the scores you report for nVidia models are down on my own stock-system tests by quite a bit (btw, I have access to play with and test most new devices except for the brand-spanking new).

    In the pocket of AMD or just bad testing?!?

    One example is your GTX 590 results in relation to its capable resolutions versus the AMDs, where I find your results fall some 15-20 fps short!
  • SSri
    I would have loved the usual computing, number crunching and video encoding as a part of the benchmark...
  • Rattengesicht
    Anonymous said:
    I would have loved the usual computing, number crunching and video encoding as a part of the benchmark...

    Waste of time. Nvidia cripples their Geforce cards massively when it comes to GPGPU stuff. Those are straight gaming cards, absolutely useless for everything else.
    Just look at their incredibly horrible performance when it comes to FP64 or even just OpenCL stuff.
    Makes me sad having to switch to AMD just because I can't afford spending a few thousand dollars on a quadro for semi-pro things.
  • damian86
    I don't really see the point in having a third DVI port instead of an HDMI port.
    Plus, how about having SLI through an HDMI interface instead? Would this make a difference?
  • asal1980
    Lamborghini’s Reventón. They’re all million-dollar-plus automobiles that most of us love to read about, probably won’t see on the road, and almost certainly will never own.
  • s1ddy
    Running 2 690's in SLI and am VERY happy with them; all my games run smoothly at 130-150 fps with all settings maxed out. The only thing is setting up Surround.., it's a pain to find the ports that need to be used for a TriMon setup :P.. The hot air blowing into my case was simply fixed by proper airflow (turning the front fan from suck to blow fixed it; using an NZXT Phantom, so the 1 big and 2 small side fans compensate perfectly). Maybe if ATI answers nVidia in a good way I'll switch to ATI, but for now I'm sticking to nVidia's Keplers ;)