
Nvidia GeForce GTX 780 Ti Review: GK110, Fully Unlocked

By Igor Wallossek

Hot on the heels of AMD's Radeon R9 290X receiving acclaim for a fair price and high performance, Nvidia is launching its fastest single-GPU gaming card ever: GeForce GTX 780 Ti. It's quicker than 290X, but also more expensive. Is the premium worthwhile?

GeForce GTX Titan is a super-fast graphics card, right? We know it employs a trimmed-back version of Nvidia’s GK110 GPU, and sure, we’ve often wondered what a fully-functional version of the processor could do. But given the board’s once-uncontested performance lead and its butt-clenching £800 price tag, it was never a sure thing that GK110, uncut, would ever surface on the desktop.

After all, GK110 is a 7.1-billion-transistor GPU. And Nvidia is already (happily) selling a 2880-core version into £4000 Quadro K6000 cards.

Competition has a way of altering perspective, though. AMD’s Radeon R9 290X launch wasn’t perfect. However, it taught us that the Hawaii GPU, properly cooled, can humble Nvidia’s mighty Titan at a much lower price point.

Not to be caught off-guard, Nvidia was already binning its GK110B GPUs, which have been shipping since this summer on GeForce GTX 780 and Titan cards. The company won’t get specific about what it was looking for, but we have to imagine it set aside flawless processors with the lowest power leakage to create a spiritual successor for GeForce GTX 580. Today, those fully-functional GPUs drop into Nvidia’s GeForce GTX 780 Ti.

GK110 In Its Full Glory

That’s right—we’re finally getting a glimpse of GK110 with all of its Streaming Multiprocessors turned on. So, GeForce GTX 780 Ti features a total of 2880 CUDA cores and 240 texture units. For the sake of completeness, we can work backward: given 192 shaders per SMX, we have 15 working blocks, and with three SMX blocks per Graphics Processing Cluster, there are five of those operating in parallel, too.
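For readers who want to check that arithmetic, here's a quick sketch of the back-calculation using only figures quoted above (the constant names are my own, not Nvidia's):

```python
# Back-calculating GK110's layout from its published shader count.
CUDA_CORES = 2880          # total shaders on GeForce GTX 780 Ti
SHADERS_PER_SMX = 192      # CUDA cores per SMX block
SMX_PER_GPC = 3            # SMX blocks per Graphics Processing Cluster
TEX_UNITS_PER_SMX = 16     # texture units per SMX

smx_count = CUDA_CORES // SHADERS_PER_SMX   # 15 working SMX blocks
gpc_count = smx_count // SMX_PER_GPC        # 5 GPCs operating in parallel
tex_units = smx_count * TEX_UNITS_PER_SMX   # 240 texture units

print(smx_count, gpc_count, tex_units)      # 15 5 240
```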

This is one SMX more than GeForce GTX Titan, with its 2688 CUDA cores, enjoys. So, you get 192 additional shaders and 16 more texture units. Nvidia also turns up the GPU’s clock rates. Titan’s base clock is 837 MHz and its typical GPU Boost frequency is specified at 876 MHz. GTX 780 Ti starts at 875 MHz and, Nvidia says, can be expected to stretch up to 928 MHz in most workloads.

GK110’s back-end looks the same. Six ROP partitions handle up to eight pixels per clock, adding up to 48 ROP units. A sextet of 64-bit controllers facilitates a familiar 384-bit aggregate memory bus. Only, rather than dropping 1500 MHz modules onto it like the company did with Titan, Nvidia leans on the latest 1750 MHz memory, yielding a 7 Gb/s per-pin data rate and up to 336 GB/s of bandwidth.
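Those two bandwidth figures follow directly from the memory clock and bus width. A minimal sketch of the arithmetic (variable names are mine; the constants come from the paragraph above):

```python
# GDDR5 bandwidth arithmetic for GTX 780 Ti's memory subsystem.
MEM_CLOCK_MHZ = 1750      # GDDR5 command clock
TRANSFERS_PER_CLOCK = 4   # GDDR5 moves data four times per command clock
BUS_WIDTH_BITS = 384      # six 64-bit controllers in aggregate

per_pin_gbps = MEM_CLOCK_MHZ * TRANSFERS_PER_CLOCK / 1000   # 7.0 Gb/s per pin
bandwidth_gb_s = per_pin_gbps * BUS_WIDTH_BITS / 8          # 336.0 GB/s total

print(per_pin_gbps, bandwidth_gb_s)    # 7.0 336.0
```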

The design decision that’ll probably trigger the most controversy is Nvidia’s choice to use 3 GB of GDDR5, down from Titan’s 6 GB. In today’s games, I’ve tested 3 GB cards like the Radeon R9 280X at up to 3840x2160 and not had issues running out of memory. You will, however, have trouble with three QHD screens at 7680x1440. Battlefield 4, for example, goes right over 3 GB of memory usage at that resolution. You’ll be fine at 5760x1080 and Ultra HD for now, but on-board GDDR5 will become a bigger issue moving forward.

Is GeForce GTX 780 Ti More Titanic Than Titan?

At this juncture, the most natural question to ask is: well, what about the £800 GeForce GTX Titan? Nvidia is calling GeForce GTX 780 Ti the fastest gaming graphics card ever, and it’s selling for under £600. That’s less than Titan for a card with technically superior specifications.

Titan lives on as a solution for CUDA developers and anyone else who needs GK110’s double-precision compute performance, but doesn’t need the workstation-oriented ECC memory protection, RDMA functionality, or Hyper-Q features you’d get from a Tesla or Quadro card. Remember: each SMX block on GK110 includes 64 FP64 CUDA cores. A Titan card with 14 active SMXes, running at 837 MHz, should be capable of 1.5 TFLOPS of double-precision math.

GeForce GTX 780 Ti, on the other hand, gets neutered in the same way Nvidia handicapped its GTX 780. The card’s driver deliberately operates GK110’s FP64 units at 1/8 of the GPU’s clock rate. Combine that with the 3:1 ratio of single- to double-precision CUDA cores and you end up with a 1/24 DP rate. The math works out to roughly 5 TFLOPS of single- and 210 GFLOPS of double-precision compute performance.
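Both cards' peak-throughput figures can be reproduced from the clocks and core counts above. A quick sketch, assuming the usual convention that a fused multiply-add counts as two floating-point operations per core, per clock (the helper name is mine):

```python
def peak_tflops(cores, clock_mhz):
    # FMA = two floating-point operations per core, per clock.
    return 2 * cores * clock_mhz / 1e6

titan_dp = peak_tflops(14 * 64, 837)   # 14 SMXes x 64 FP64 cores ~= 1.5 TFLOPS
ti_sp = peak_tflops(2880, 875)         # ~= 5.04 TFLOPS single precision
ti_dp = ti_sp / 24                     # 1/24 DP rate ~= 0.21 TFLOPS (210 GFLOPS)

print(round(titan_dp, 1), round(ti_sp, 2), round(ti_dp, 2))   # 1.5 5.04 0.21
```

Note that the single-precision figure uses the 875 MHz base clock; GPU Boost would push it slightly higher.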

That’s a compromise, no question. But Nvidia had to do something to preserve Titan’s value and keep GeForce GTX 780 Ti from cannibalizing sales of much more expensive professional-class cards. AMD does something similar with its Hawaii-based cards (though not as severe), limiting DP performance to 1/8 of FP32.

And so we’re left with GeForce GTX 780 Ti unequivocally taking the torch from Titan when it comes to gaming, while Titan trudges forward more as a niche offering for the development and research community. The good news for desktop enthusiasts is that Nvidia’s price bar comes down £200, while performance goes up.

Now, is that enough to flip the script on AMD and its Radeon R9 290X? That card is still selling at a very attractive (for ultra-high-end hardware) £450 price point, after all. Here’s the thing: as you saw two days ago in our R9 290 coverage, retail cards are rolling into our lab, and we’re not seeing the same Titan-beating performance that manifested in Radeon R9 290X Review: AMD's Back In Ultra-High-End Gaming. With only a handful of data points pegging 290X between GeForce GTX 770 and 780, and others quicker than Titan, consistency appears to be AMD’s enemy right now. Company representatives confirm that there's a discrepancy between absolute fan speed and its PWM controller, and say they're working to remedy this with a software update. Our German team continued investigating as I peeled off to cover GeForce GTX 780 Ti, and demonstrated that press and retail cards are spinning at different fan speeds. But there's more to this story relating to ambient conditions, so you'll be hearing more about it soon.

Nvidia is seizing on this issue in the meantime, and with good reason. With clock rates ranging from 727 to 1000 MHz on our Radeon R9 290X cards, and AMD’s reference thermal solution limiting performance at different frequencies in different games, we couldn’t draw a conclusion one way or the other in AMD Radeon R9 290 Review: Fast And £320, But Is It Consistent? Can we be any more definitive about Nvidia’s response to all of the Hawaii news?

  • grebgonebad, 7 November 2013 14:42
    No 3D mark score? Shame, I would have been interested to see how it fared. =(

    Just a question to the people who run the benchmarks, could you start including Passmark scores with your benchmark lists please? I regularly use this benchmark myself and would love to see some results from cards that I cannot get my hands on. =)
  • SPLWF, 7 November 2013 22:27
    That R9 290 hits the sweet spot: great price/performance ratio, considering the 780 Ti is $700 and not that far off in FPS.
  • Mousemonkey, 7 November 2013 22:32
    Quote:
    Nvidia is always overpriced. When are they gonna get real?


    In Denmark everything is overpriced! :lol:
  • dottorrent, 8 November 2013 00:12
    Such a pointless card. Nvidia just did that to reclaim the record.
  • markem, 8 November 2013 05:09
    I was hoping we would see a 780ti retail card alongside these benches
  • grebgonebad, 8 November 2013 09:02
    Quote:
    Such a pointless card. Nvidia just did that to reclaim the record.


    Pointless? Really? More powerful, yet dramatically cheaper than the Titan? Obviously sir you own an AMD GPU, not that I'm judging by any means, but I implore you to research your facts before making ridiculous statements such as that.

    FYI, I have previously owned several AMD cards, and probably will do again at some point, I have no bias towards either make.
  • grebgonebad, 8 November 2013 09:05
    Quote:
    Nvidia is always overpriced. When are they gonna get real?


    As soon as AMD start making cards that are the same or better performance as Nvidia's cards, but sell them for a cheaper price I'm guessing. =P

    But seriously, while Nvidia are more expensive than AMD, you have to give Nvidia credit where it's due. Making a single-GPU card that can outperform 99% of the cards on the planet (the remaining 1% of course being dual-GPU cards) requires a premium price, due to the amount of research that went into creating such a powerful piece of kit.
  • dottorrent, 8 November 2013 11:08
    Quote:
    Quote:
    Such a pointless card. Nvidia just did that to reclaim the record.


    Pointless? Really? More powerful, yet dramatically cheaper than the Titan? Obviously sir you own an AMD GPU, not that I'm judging by any means, but I implore you to research your facts before making ridiculous statements such as that.

    FYI, I have previously owned several AMD cards, and probably will do again at some point, I have no bias towards either make.


    Yes, I actually do own an AMD GPU but I am not a fan to just one make. I just see it as just a "we need that record back, fast" moment. Then again, the price is tempting.
  • grebgonebad, 8 November 2013 11:50
    Quote:
    Quote:
    Quote:
    Such a pointless card. Nvidia just did that to reclaim the record.


    Pointless? Really? More powerful, yet dramatically cheaper than the Titan? Obviously sir you own an AMD GPU, not that I'm judging by any means, but I implore you to research your facts before making ridiculous statements such as that.

    FYI, I have previously owned several AMD cards, and probably will do again at some point, I have no bias towards either make.


    Yes, I actually do own an AMD GPU but I am not a fan to just one make. I just see it as just a "we need that record back, fast" moment. Then again, the price is tempting.


    I'd just like to apologise for my previous statement. It was rude of me. I had just been given some VERY bad news and was a little angry, so again, I apologise.

    I see your point, Nvidia have always been the top rankers, and I suppose they're getting a little worried now that AMD are getting dangerously close to their realm of performance for such a difference in price. Even though I do not plan on upgrading my rig for at least a year yet, I will definitely be paying close attention to what AMD shall be releasing next. If their latest efforts are anything to go by we should be seeing some pretty impressive tech being released. I am especially interested in their new 'Mantle' technology.
  • Mousemonkey, 8 November 2013 12:19
    Quote:
    Quote:
    Quote:
    Such a pointless card. Nvidia just did that to reclaim the record.


    Pointless? Really? More powerful, yet dramatically cheaper than the Titan? Obviously sir you own an AMD GPU, not that I'm judging by any means, but I implore you to research your facts before making ridiculous statements such as that.

    FYI, I have previously owned several AMD cards, and probably will do again at some point, I have no bias towards either make.


    Yes, I actually do own an AMD GPU but I am not a fan to just one make. I just see it as just a "we need that record back, fast" moment. Then again, the price is tempting.


    You seem to be forgetting that Nvidia always felt that they had the upper hand.

    Nvidia: We Expected More from AMD Radeon HD 7970.
  • grebgonebad, 11 November 2013 10:05
    To be fair though, Nvidia do have the upper hand, as is proven by their benchmark scores. =)

    I think though, that judging by AMD's recent efforts combined with Mantle technology Nvidia could be usurped by this time next year. Maybe not by being the most powerful, but as far as value for money goes...
  • jkrui01, 15 November 2013 14:24
    I only come here to laugh; this Toms is super biased, everyone else favours AMD.
    Read a REAL review here:
    http://www.hardocp.com/article/2013/11/11/geforce_gtx_780_ti_vs_radeon_r9_290x_4k_gaming
  • grebgonebad, 15 November 2013 14:47
    Quote:
    i only come here to laugh, this TOMS is super biased, everyone else favours the AMD.
    Read a REAL review here:
    http://www.hardocp.com/article/2013/11/11/geforce_gtx_780_ti_vs_radeon_r9_290x_4k_gaming


    Toms is actually a reputable review website that is biased towards neither faction, and to say otherwise makes the person in question themselves biased. Sorry, but that's just the way it is.

    While there are many people who prefer AMD, there are an equal amount of people who prefer Nvidia. Personally I don't favour either one more than the other, as they both have their strengths and weaknesses. Whereas Nvidia have the most powerful cards overall, AMD sell similar-performance cards for a reasonably low price point. It is the same story with Intel and AMD CPUs.

    Either way, to say that the majority of people prefer one manufacturer over the other is a very narrow-minded opinion, and I urge you to widen your gaze and look at the bigger picture. =)

    FYI, I'm not being biased towards Nvidia here either, as I have owned a variety of both AMD and Nvidia cards and like both companies cards equally.
  • icezar1, 26 November 2013 11:00
    I can see 2 wins with this card:

    - lower power consumption when idling, playing blu-ray, running multi-monitor setups;
    - quieter than AMD's Hawaii series.

    There are few downsides though compared to a standard R9 290 (non-OC!):

    - maximum 15% performance increase for about 50% more money;
    - more power hungry in games (not by much but still);

    Compared to the OC R9 290 (1150 GPU/1350 RAM) the 780 Ti has nothing to offer to gamers.

    Furthermore, Mantle is coming to BF4 in December (and other games later on) :) 
    I'm curious what the benchmarks will say compared to DX11.
    http://www.tomshardware.com/news/amd-mantle-api-gcn-battlefield-4,24418.html
  • grebgonebad, 28 November 2013 10:06
    Quote:
    I can see 2 wins with this card:

    - lower power consumption when idling, playing blu-ray, running multi-monitor setups;
    - quieter than AMD's Hawaii series.

    There are few downsides though compared to a standard R9 290 (non-OC!):

    - maximum 15% performance increase for about 50% more money;
    - more power hungry in games (not by much but still);

    Compared to the OC R9 290 (1150 GPU/1350 RAM) the 780Ti has.. nothing to offer to gamers.

    Furthermore, Mantle is coming to BF4 in December (and other games later on) :) 
    I'm curious what the benchmarks will say compared to DX11.
    http://www.tomshardware.com/news/amd-mantle-api-gcn-battlefield-4,24418.html


    Granted, your arguments are sound. But I will bring the Titan into the equation here.

    The 780 was a bargain, at 90% the performance of the Titan for half the price (thereabouts, depending on the manufacturer obviously), and now that the 780 Ti is out with no restrictions in the GPU, you are getting more performance than the Titan (substantially more if you ask me) for still less than a Titan. =)

    Admittedly, Mantle technology does look excellent. I think this is AMD's metaphorical middle finger against Nvidia's PhysX.