GeForce GTX 690 Review: Testing Nvidia's Sexiest Graphics Card

PCI Express 3.0 And What Of GK110?

PCI Express 3.0 Representing

There’s a story behind Nvidia’s support for third-gen PCI Express and Intel’s X79 Express platform. But it requires a little bit of history.

Way back when I first previewed Sandy Bridge-E (check out Intel Core i7-3960X (Sandy Bridge-E) And X79 Platform Preview for that little piece of history), everyone I talked to insisted that the processor’s PCIe controller wasn’t going to be validated at 8 GT/s data rates. It'd be a PCIe 2.0 part. Then, suddenly the story changed and it was called 8 GT/s-capable (though mention of the standard itself was left out).

When AMD launched its Radeon HD 7000-series cards, we were able to demonstrate them operating at PCI Express 3.0 signaling speeds. Then, Nvidia launched its GeForce GTX 680—with a press driver that was limited to 5 GT/s. The company sent us a second version to show that PCI Express 3.0 was working, and assured us that it’d operate at 8 GT/s on Ivy Bridge-based platforms (which we’ve since confirmed).

Why not just ship it like that? There was a reason. We're digging deeper, but we aren't yet ready to discuss our findings.

Let’s put the puzzle pieces together, though.

  1. X79 and Sandy Bridge-E were originally going to operate at second-gen signaling rates.
  2. GeForce GTX 680, a card that scales really well in SLI, operates at 5 GT/s data rates attached to Sandy Bridge-E processors and 8 GT/s in Ivy Bridge-based platforms.
  3. GeForce GTX 690 offers 8 GT/s signaling in both Sandy Bridge-E and Ivy Bridge-based platforms.

The issue doesn’t appear to be related to GK104, Nvidia’s card, or its driver. Rather, it’d seem to relate back to our original report that Sandy Bridge-E was not fully validated for PCI Express 3.0.
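If you want to check which rate your own card has negotiated, `lspci -vv` on Linux reports it on the `LnkSta:` line. Below is a minimal sketch that maps that reported speed back to a PCI Express generation; the helper name and the sample strings are our own illustration, not part of any vendor tool.

```python
import re

# Per-lane signaling rates (as lspci prints them on the LnkSta line)
# mapped to the PCI Express generation they correspond to.
SPEED_TO_GEN = {
    "2.5GT/s": "1.0",
    "5GT/s": "2.0",
    "8GT/s": "3.0",
}

def pcie_gen_from_lnksta(lnksta_line: str) -> str:
    """Extract the negotiated speed from an lspci 'LnkSta:' line and
    return the matching PCIe generation."""
    match = re.search(r"Speed\s+([\d.]+GT/s)", lnksta_line)
    if not match:
        raise ValueError("no link speed found in: " + lnksta_line)
    return SPEED_TO_GEN[match.group(1)]

# A GTX 680 on Sandy Bridge-E with the press driver negotiated 5 GT/s (gen 2):
print(pcie_gen_from_lnksta("LnkSta: Speed 5GT/s, Width x16"))  # -> 2.0
# GeForce GTX 690 negotiates 8 GT/s (gen 3) on both platforms:
print(pcie_gen_from_lnksta("LnkSta: Speed 8GT/s, Width x16"))  # -> 3.0
```

Note that `LnkSta` shows the currently negotiated rate, which can drop to a lower generation at idle; `LnkCap` shows the maximum the device supports.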

Is This It For Affluent Gamers In 2012?

I saw a lot of comments from folks who read GeForce GTX 680 2 GB Review: Kepler Sends Tahiti On Vacation and decided they wanted to wait for Nvidia to launch a desktop-oriented card based on a more complex graphics processor—if only because they were unwilling to pay £400 for the company’s next-gen “Hunter” (if you don’t know what I’m talking about, check out the first page of my GeForce GTX 680 review).

On behalf of those folks, I plied Nvidia for more information about a proper “Tank” in the GeForce GTX 600-series. Although the company’s representatives were deliberately vague about the existence of another GPU, they clearly indicated that GeForce GTX 690 wouldn’t be eclipsed any time soon. Personally, I’d be surprised to see anything based on a higher-end GPU before Q4.

Even then, there’s no guarantee that a tank-class card would outperform two GK104s (GF104 had little trouble destroying GF100 in Amazing SLI Scaling: Do Two GeForce GTX 460s Beat One GTX 480?, after all). The more likely outcome would be a better-balanced GPU able to game and handle compute-oriented tasks.

Comment from the forums
  • tracker45
    poor price/performance ratio though.
  • tracker45
    the 7950 wins this battle
  • dizzy_davidh
    As I've posted here a lot of times, I don't know how you perform your testing but I always find that most of the scores you report for nVidia models are down on my own stock-system tests by quite a bit (btw, I have access to play with and test most new devices except for the brand-spanking new).

    In the pocket of AMD or just bad testing?!?

    One example is your GTX 590 results, in relation to its capable resolutions versus the AMD cards', where I find your results fall some 15-20 fps short!
  • SSri
    I would have loved the usual computing, number crunching and video encoding as a part of the benchmark...
  • Rattengesicht
    Anonymous said:
    I would have loved the usual computing, number crunching and video encoding as a part of the benchmark...

    Waste of time. Nvidia cripples their Geforce cards massively when it comes to GPGPU stuff. Those are straight gaming cards, absolutely useless for everything else.
    Just look at their incredibly horrible performance when it comes to FP64 or even just OpenCL stuff.
    Makes me sad having to switch to AMD just because I can't afford spending a few thousand dollars on a quadro for semi-pro things.
  • damian86
    I don't really see the point in having a third DVI port instead of a HDMI?
    plus how about having SLI through a HDMI interface instead? would this make a difference?
  • asal1980
    Lamborghini’s Reventón. They’re all million-dollar-plus automobiles that most of us love to read about, probably won’t see on the road, and almost certainly will never own.
  • s1ddy
    Running 2 690's in SLI and am VERY happy with them; all my games run smoothly at 130-150 fps, all settings maxed out. Only thing is setting up Surround... it's a pain to find the ports that need to be used for a TriMon setup :P. The hot air blowing into my case was simply fixed by proper airflow (turning the front fan from suck to blow fixed it; using a NZXT Phantom, so the 1 big and 2 small side fans compensate perfectly). Maybe if ATI answers nVidia in a good way I'll switch to ATI, but for now I'm sticking to nVidia's Keplers ;)
    Running 2 690's in SLI and am VERY happy with them all my games run smoothly at 130-150Fps all settings maxed out, only thing is setting up Surround.., it's a pain to find the ports that need to be used for a TriMon setup :P.. The hot air blowing into my case was simply fixed by proper airflow (turning the front fan from suck to blow fixed it, using a NXZT Phantom so the 1 big and 2 small sidefans compensate perfectly). Maybe if ATI answers nVidia in a good way i'll switch to ATI but for now i'm sticking to nVidia's Keplers ;)