
OpenCL And CUDA Are Go: GeForce GTX Titan, Tested In Pro Apps

We initially had trouble getting the GeForce GTX Titan to work with OpenCL and CUDA. Finally, though, there are drivers available that fix all of that. Now we can figure out if the Titan makes a good workstation-oriented alternative to Nvidia's Quadros.

We covered Nvidia's still-new GeForce GTX Titan in Nvidia GeForce GTX Titan 6 GB: GK110 On A Gaming Card and Benchmarking GeForce GTX Titan 6 GB: Fast, Quiet, Consistent. As a gaming product, we know it to be the fastest single-GPU board you can buy. But how does the vaunted Titan fare in professional applications? We weren't able to run a number of those tests at launch because Nvidia's drivers weren't working in most of the non-gaming applications we tried.
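
The simplest way to confirm that a newly installed driver actually exposes the Titan to the compute APIs is to enumerate devices before running anything heavier. The short CUDA runtime sketch below is a hypothetical example rather than part of our benchmark suite; it lists each device's name, compute capability, and memory, and a GK110-based Titan should report compute capability 3.5.

// listdevices.cu -- hypothetical sanity check: does the installed driver expose the card to CUDA?
// Build with: nvcc -o listdevices listdevices.cu
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        std::printf("CUDA not available: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // A GK110-based GeForce GTX Titan should report compute capability 3.5
        // and roughly 6 GB of memory.
        std::printf("Device %d: %s, compute %d.%d, %.1f GB\n",
                    i, prop.name, prop.major, prop.minor,
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}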

Nevertheless, you'd think that, given its GK110 GPU, first introduced on a couple of Nvidia's Tesla accelerator boards, the GeForce GTX Titan would be a shoo-in for a market that doesn't flinch at £1000 graphics cards. So, we're putting it up against a number of other desktop-oriented graphics cards (the Radeon HD 7970, Radeon HD 6970, GeForce GTX 680, and GeForce GTX 580) to see how the last two generations of flagship gaming products handle workstation-class software.

We're using a Titan card that Gigabyte sent over. It's based on Nvidia's reference design, though Gigabyte does throw in some extras to set its offering apart: a large mouse pad, a deck of playing cards, some cables, and the obligatory adapters.

The previous-gen processor in our test bed was swapped out in favor of an overclocked Core i7-3770K to help minimize platform bottlenecks. It took a clock rate of 4.6 GHz before application performance stopped scaling with processor speed, which just goes to show how CPU-limited much of this older software still is. Optimizations for threading, CUDA, and OpenCL are playing a larger role in rendering tasks, but some workloads still aren't being parallelized.

Benchmark System

CPU: Intel Core i7-3770K (Ivy Bridge), 22 nm, 4C/8T, 8 MB Shared L3 Cache, Hyper-Threading Enabled, Overclocked to 4.6 GHz
RAM: 32 GB Corsair Dominator Platinum @ 2,066 MT/s
Motherboard: Gigabyte G1.Sniper 3, Intel Z77 Express
SSD: 2 x Corsair Neutron 480 GB
OS: Windows 7 Ultimate x64 (Fully Patched)
Drivers: GeForce 314.22 WHQL, Catalyst 13.3 Beta 3


We already know what happens when Tesla's GK110 GPU is tossed into a gaming environment. So, what happens when we put that same hardware to work in a professional setting?

Today's story also serves as a preview of a big workstation graphics card round-up we have coming, covering all of the new Kepler-based Quadro cards. We're going to use the same benchmarks (and a lot more) to compare two generations of Nvidia and AMD offerings. Right now, we're still sorting out some driver issues, which shows why it's so important for these companies to seek out certifications for their premium products. You'll see the results from these gaming cards added to that piece, too.

Comments

Valentin_N, 17 April 2013 14:40
Thanks for the complex and detailed review!

HEXiT, 17 April 2013 15:29
My one issue with this article is that you say it's compared to workstation cards, but you posted no numbers for any workstation card, so we can't see how the gaming-grade hardware stacks up. Is this because the gaming-grade cards murder the workstation cards in most things?

HEXiT, 17 April 2013 15:32
For the most part great stuff, but I was a bit disappointed with it, tbh. I was expecting some numbers from workstation cards to compare the gaming-grade hardware against. After all, this is where workstation cards should excel, so seeing them trounce a $1000 gaming card would have been interesting. Or was it the other way round for most everything?

BTW, I can't post this on the article itself; every time I try I get an error, so I had to come into the forum to post.

sicofante, 17 April 2013 19:45
@HEXiT: If you read the full article carefully, it says on several occasions that an upcoming review of the Titan vs. pro-grade cards is in the works.

maxcellerate, 19 April 2013 09:48
I wouldn't expect the Photoshop benchmarks to be great, but where are they???

klimax, 20 April 2013 13:47
There's information missing from this article: it doesn't say what mode the Titan runs in! It looks like the default single-precision mode was left enabled, but for compute applications the double-precision mode should have been used, and that could make quite a big difference (1/3 versus 1/24 rate is a large gap in DP).