Nvidia’s Turing Architecture Explored: Inside the GeForce RTX 2080

NVLink: A Bridge To…Anywhere?

TU102 and TU104 are Nvidia’s first desktop GPUs rocking the NVLink interconnect rather than a Multiple Input/Output (MIO) interface for SLI support. The former makes two x8 links available, while the latter is limited to one. Each link facilitates up to 50 GB/s of bidirectional bandwidth. So, GeForce RTX 2080 Ti is capable of up to 100 GB/s between cards and RTX 2080 can do half of that.

One link or two, though, SLI over NVLink only works across a pair of GeForce RTX boards with at least one empty slot between them for airflow. Officially, Pascal-era GPUs were subject to the same two-card maximum. Technically, however, as many as four top-end GeForce GTXes could be made to work together in a handful of benchmarks. These days, you’ll also have to purchase your own GeForce RTX NVLink Bridge for multi-GPU connectivity. Three- and four-slot sizes are both available for $80 from Nvidia’s website.

Some of last generation’s trouble was caused by the bandwidth constraints of SLI bridges. Compared to the original SLI interface’s 1 GB/s MIO link, Pascal’s implementation drove ~4 GB/s. That was fast enough to get the second card’s rendered frame back to the primary board in time for smooth output to a 4K monitor at 60 Hz. But it wouldn’t have been able to keep up at 120 Hz and higher, which is where today’s highest-end gaming displays operate.

Even in a single-link configuration, NVLink can move data so quickly that SLI on an 8K screen is possible. Driving three 4K monitors at 144 Hz in Surround mode is no problem at all. Two x8 links have the throughput needed for 8K displays in Surround.
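The arithmetic behind those claims is easy to sanity-check. A rough sketch, assuming each rendered frame crosses the bridge uncompressed at 4 bytes per pixel (real transfers may use compression or send only partial frames):

```python
# Back-of-the-envelope check of SLI bridge bandwidth needs, assuming
# every frame ships across the link uncompressed at 4 bytes per pixel.

def frame_traffic_gbps(width, height, refresh_hz, bytes_per_pixel=4):
    """GB/s required to move every rendered frame over the bridge."""
    return width * height * bytes_per_pixel * refresh_hz / 1e9

scenarios = {
    "4K @ 60 Hz":                 frame_traffic_gbps(3840, 2160, 60),
    "4K @ 144 Hz":                frame_traffic_gbps(3840, 2160, 144),
    "3x 4K @ 144 Hz (Surround)":  3 * frame_traffic_gbps(3840, 2160, 144),
    "8K @ 60 Hz":                 frame_traffic_gbps(7680, 4320, 60),
}

for name, need in scenarios.items():
    # Compare against Pascal's ~4 GB/s bridge and one 50 GB/s NVLink x8 link.
    print(f"{name}: ~{need:.1f} GB/s needed")
```

Under these assumptions, 4K at 60 Hz needs roughly 2 GB/s (comfortable for Pascal’s ~4 GB/s bridge), 4K at 144 Hz needs almost 5 GB/s (too much for it), while 8K at 60 Hz (~8 GB/s) and triple-4K Surround at 144 Hz (~14 GB/s) fit easily within a single 50 GB/s NVLink x8 link.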

Really, the question is: who cares anymore? AMD and Nvidia did such a good job of pumping the brakes on multi-GPU configurations that Tom’s Hardware readers rarely, if ever, ask for benchmark results from an SLI setup. Back in the day, value-minded gamers used SLI to match the performance of higher-end cards. Nvidia put a stop to that by removing support from lower-end models in its product stack. Now, even GeForce RTX 2070 lacks an NVLink connector. Older DirectX 11-based games still run well across two cards, and a handful of DirectX 12-based titles do exploit the API’s explicit multi-adapter control. But the fact that developers like EA DICE are pouring time into taxing features like real-time ray tracing and ignoring multi-GPU says a lot about SLI’s future.

We’ve heard Nvidia representatives say they’ll have more to discuss on this front in the future. For now, NVLink support on GeForce RTX 2080 Ti and 2080 is a novelty, particularly as we’re able to focus on playable frame rates at 4K and G-Sync technology to keep the action smooth.


  • siege19
    "And although veterans in the hardware field have their own opinions of what real-time ray tracing means to an immersive gaming experience, I’ve been around long enough to know that you cannot recommend hardware based only on promises of what’s to come."

    So wait, do I preorder or not? (kidding)
  • jimmysmitty
    Well done article Chris. This is why I love you. Details and logical thinking based on the facts we have.

    Next up: benchmarks. Can't wait to see if the improvements Nvidia made come to fruition in performance worthy of the price.
  • Lutfij
    Waiting with bated breath for performance metrics.
    Pricing seems to be off, but the follow-up review should guide users as to its worth!
  • Krazie_Ivan
    i didn't expect the 2070 to be on TU106. as noted in the article, the x106 has been a mid-range ($240-ish MSRP) chip for a few generations... asking $500-600 for a mid-range GPU is insanity. esp since there's no way it'll have playable fps with RT "on" if the 2080ti struggles to maintain 60. DLSS is promisingly cool, but that's still not worth the MASSIVE cost increases.
  • jimmysmitty
    904774 said:
    i didn't expect the 2070 to be on TU106. as noted in the article, **106 has been a mid-range ($240-ish msrp) chip for a few generations... asking $500-600 for a mid-range GPU is insanity. esp since there's no way it'll have playable fps with RT "on" if the 2080ti struggles to maintain 60. DLSS is promisingly cool, but that's still not worth the MASSIVE cost increases.


    It is possible that they are changing their lineup scheme. The 106 might have become the lowest high-end chip, and they might have something lower to replace it. This happens all the time.
  • Lucky_SLS
    turing does seem to have the ability to pump up the fps if used right with all its features. I just hope that nvidia really made a card to power up its upcoming 4k 200hz hdr g sync monitors. wow, thats a mouthful!
  • anthonyinsd
    ooh man, the Jedi mind trick Nvidia played on hyperbolic gamers to get rid of their overstock is gonna be EPIC!!! and just based on facts: 12nm, GDDR6, awesome new voltage regulation, and game-only processes, that's a win in my book. I mean if all you care about is your rast score, then you should be on the hunt for a Titan V, if it doesn't rast it's trash lol. been 10 years since econ 101, but if you want to get rid of overstock you don't tell much about the new product till it's out; then the people who thought they were smart getting the older product now want to buy the new one too....
  • none12345
    I see a lot of features that are seemingly designed to save compute resources and output lower image quality. With the promise that those savings will then be applied to increase image quality on the whole.

    I'm quite dubious about this. My worry is that some of the areas of computer graphics that need the most love are going to get even worse. We can only hope that overall image quality goes up at the same frame rate, rather than frame rate going up and parts of the image getting worse.

    I do not long to return to the days when different graphics cards output different image quality at the same up-front graphics settings. This was very annoying in the past. You had some cards that looked faster if you just looked at their fps numbers, but then you looked at the image quality and noticed that one was noticeably worse.

    I worry that in the end we might end up in the age of blur. Where we have localized areas of shiny highly detailed objects/effects layered on top of an increasingly blurry background.
  • CaptainTom
    I have to admit that since I have a high-refresh (non-Adaptive Sync) monitor, I am eyeing the 2080 Ti. DLSS would be nice if it was free in 1080p (and worked well), and I still don't need to worry about Gstink. But then again I have a sneaking suspicion that AMD is going to respond with 7nm Cards sooner than everyone expects, so we'll see.

    P.S. Guys the 650 Ti was a 106 card lol. Now a xx70 is a 106 card. Can't believe the tech press is actually ignoring the fact that Nvidia is relabeling their low-end offering as a xx70, and selling it for $600 (Halo product pricing). I swear Nvidia could get away with murder...
  • mlee 2500
    12nm is no longer considered a "Slight Density Improvement".

    Hasn't been for over a decade. It's only lumped in with 16nm from a marketing standpoint because it's no longer the flagship lithography (7nm is).
  • TMTOWTSAC
    In a perfect world, the non-RT models would be based off the TU architecture without any of the RT silicon, and priced accordingly. They're claiming RT is the must have feature and subsequently worth the price premium. Given those claims it's going to be very interesting to see what pricing scheme they go with for the non-RT models.
  • mlee 2500
    Great article, very informative, thank you for taking the time to write it.
  • dimar
    No need to waste your hard-earned money. AMD Navi is around the corner, and if Navi isn't that good, RTX prices will be lower by then. With AMD you get FreeSync, which most monitors support these days.
  • Reynod
    Fantastic read as always Chris.

    Objective, with warts ... an easy read ... informative ... with detail.

    I hope you are editing the article that gets released here with the benchies once the NDA is lifted.

    I will spend money based on that content ...
  • cangelini
    Thanks guys.

    Yes, I will be spending a long caffeine-fueled weekend with graphics cards, Excel, and Word. Let me know if there are any specific requests on comparisons you'd like to see made!
  • truerock
    I've been running my Nvidia GeForce GTX 690 for 6 years. It does 3840 x 2160 at 30fps.
    The lack of HDMI 2.1 is just enough of a negative to keep me from buying a GeForce RTX 2080 Ti.
    I guess it is ironic that I actually don't want HDMI or DisplayPort outputs on my Nvidia cards. I want Nvidia cards that only have USB-C output ports.
    Oh well - maybe next year. My Nvidia GeForce GTX 690 will be 7 years old.
  • truerock
    Chris,

    Thanks for the review. It's the best I've seen on these cards so far.

    I'm interested in 3840 x 2160 at 120fps with the more popular games. What settings for a specific game allow 3840 x 2160 at 120fps vs. 60fps vs. 30fps? I'm not interested in G-Sync. Does graphics quality suffer much as settings are pushed down to allow higher frame rates?
  • bit_user
    134065 said:
    Let me know if there are any specific requests on comparisons you'd like to see made!

    Crysis @ 4k? ...you know someone will ask it. And Anandtech tested it on the Titan V, so we can compare.

    https://www.anandtech.com/show/12170/nvidia-titan-v-preview-titanomachy/8
  • cangelini
    Before they did that, I did this: https://www.tomshardware.com/reviews/crysis-10-year-anniversary-benchmarks,5329.html ;)

    Time's going to be tight, but I'll see if I can throw it on the test system.
  • Reynod
    I agree ... if you still have the Original Crysis game ... then answer "But will it play Crysis?".

    The original Badly coded game please?

    I imagine you will also have received a couple of iterations of drivers since receiving the card, so let us know how much improvement you found with these?

    Finally, when you finish can you pull the HSF off and let us know anything about the TIM you find?


    :)
  • bit_user
    123704 said:
    The original Badly coded game please?

    Uh, it should be comparable to the other Crysis benchmarks, please.
  • Reynod
    Ok then both of them ...
  • kyotokid
    ...well, the 2070 sounds like the RTX stepchild. No linking capability, which means no way to improve frame rate with a second card. So my thinking is: who would buy this card?
  • Crazyjay53
    So why hurry to get these cards when there's hardly any game that runs RTX yet? It's gonna take a while for game software to add it in-game. I'll stick with my 1080 Ti for a while, till there are benchmarks showing whether those RTX cards are worth it.