
Overclocking: I Want More Than GPU Boost

GeForce GTX 680 2 GB Review: Kepler Sends Tahiti On Vacation
The implementation of GPU Boost does not preclude overclocking. But because you can’t disable GPU Boost like you might with Intel’s Turbo Boost, you have to operate within the technology’s parameters.

For example, overclocking is now achieved through an offset. You can easily push the base 3D clock up by 100, 150, or even 200 MHz. However, if a game was already TDP-constrained at the default clock, it won’t run any faster. In apps that weren’t hitting the GTX 680’s thermal limit before, the offset pushes the performance curve closer to the ceiling.

Because GPU Boost was designed to balance clock rate and voltage with thermal design power in mind, though, overclocking is really made most effective by adjusting the board’s power target upward as well. EVGA’s Precision X tweaking tool includes built-in sliders for both the power target and the GPU clock offset.

Although GeForce GTX 680’s TDP is 195 W, Nvidia says the card’s typical board power is closer to 170 W. So, increasing the power slider actually moves this number higher. At +32%, Precision X’s highest setting is designed to get you right up to the 225 W limit of what two six-pin power connectors and a PCI Express slot are specified to deliver.  
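The arithmetic behind those figures is worth spelling out: the 225 W ceiling follows from the PCI Express power-delivery spec (75 W from the slot plus 75 W from each six-pin connector), and the +32% slider maps onto Nvidia's stated 170 W typical board power. A quick sketch, using only the numbers given above:

```python
# Power-budget arithmetic behind Precision X's +32% ceiling.
# Figures from the article: 170 W typical board power, 195 W TDP.
PCIE_SLOT_W = 75        # PCI Express x16 slot specification
SIX_PIN_W = 75          # each six-pin PEG connector

spec_ceiling_w = PCIE_SLOT_W + 2 * SIX_PIN_W   # what the card can legally draw
typical_board_power_w = 170
max_target_w = typical_board_power_w * 1.32    # +32% slider setting

print(spec_ceiling_w)        # 225
print(round(max_target_w))   # 224 -- right up against the connector limit
```

That the slider tops out at +32% rather than a rounder number makes sense in this light: any higher and the target would exceed what the connectors are specified to deliver.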

Using Crysis 2 as our very consistent test case, we can measure the effect of each adjustment on performance.

First, we launch a single run of the Central Park level at 1920x1080 in DirectX 11 mode, without anti-aliasing. We get a 72.3 FPS result, and we observe GPU Boost pushing the GeForce GTX 680 between 1071 and 1124 MHz during the run (up from the 1006 MHz base).

The top chart shows that we’re bouncing around the upper end of GK104’s power ceiling. So, we increase the target board power by 15%. The result is a small jump to 74.2 FPS, along with clocks that vacillate between 1145 and 1197 MHz.

Figuring the power target boost likely freed up some thermal headroom, we then increase the offset by 100 MHz, which enables even better performance—76.1 FPS. This time, however, we get a constant 1215 MHz. Nvidia says this is basically as fast as the card will go given our workload and the power limit.

So why not up the target power again? At 130% (basically, the interface’s 225 W specification), performance actually drops to 75.6 FPS, and the graph over time shows a constant 1202 MHz. We expected more performance, not less. What gives? This is where folks are going to find a problem with GPU Boost. Because the outcome depends on variables that are continuously monitored, performance changes over time. As a GPU heats up, current leakage increases. And as that happens, frequency and voltage are brought down to counter a vicious cycle.

The effect is similar to heat soak in an engine. If you’re on a dynamometer doing back-to-back pulls, you expect to see a drop in horsepower if you don’t wait long enough between runs. Similarly, it’s easy to get consistently high numbers from a few minute-long benchmark runs. But if you’re gaming for hours, GPU Boost cannot be as effective.
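The feedback loop described above can be illustrated with a toy model: as temperature rises, leakage eats into the power budget, and the boost logic trims the clock to stay under the target. Every constant here is invented purely for illustration; this is not how Nvidia's actual algorithm is implemented.

```python
# Toy model of GPU Boost's thermal feedback loop. All constants are
# hypothetical; only the qualitative behavior (hotter -> lower clocks)
# reflects the mechanism described in the article.
def boost_clock(temp_c, power_target_w=195, base_mhz=1006, max_mhz=1215):
    leakage_w = 0.5 * max(0, temp_c - 40)          # assumed leakage growth with heat
    headroom_w = power_target_w - 150 - leakage_w  # 150 W: assumed dynamic draw
    if headroom_w <= 0:
        return base_mhz                            # no headroom left: base clock only
    # Scale the boost linearly with remaining headroom (a simplification).
    return min(max_mhz, base_mhz + int(headroom_w * 5))

for temp in (50, 70, 90):
    print(temp, boost_clock(temp))  # clocks fall as the GPU heats up
```

Run it and the clock steps down as temperature climbs, which is exactly the heat-soak pattern observed in long gaming sessions versus short benchmark runs.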

Our attempt to push a 200 MHz offset demonstrates that, even though this technology tries to keep you at the highest frequency under a given power ceiling, increasing both limits still makes it easy to exceed the board’s potential and seize up.

Sliding back a bit to a 150 MHz offset gives us stability, but performance isn’t any better than the 100 MHz setting. No doubt, it’ll take more tinkering to find the right overclock with GPU Boost in the mix and always on.


Comments (13). This thread is closed for comments.
  • graemevermeulen, 22 March 2012 21:07 (+2)
    Wow. Knew it would be a beast, and for such a good price too. Well done Nvidia!
  • LLL, 22 March 2012 21:41 (+3)
    Well done NV! Can't wait for NV's 660 series.
  • jemm, 23 March 2012 03:41 (+1)
    It is a monster!
  • silicondoc_85, 23 March 2012 03:44 (-2)
    I love it, the GTX 680 wins and in page after page our red c angel tells us it's not meaningful as all cards can do.
    He even calls the GTX590 single card king defunct !
    The bias is so bad, and stinks to high heaven, but the review is blind as a bat to his red fanboy bloviating he would have us believe.
    I'm sure he will 100% excuse himself as he has before claiming rabid red fans complain so he's doing a good job.
    The bias is beyond disgusting anyway.
    As we will soon see, Crysis 1 and Metro 2033 will be the favorite games of all. All the rest will be described as "capable of being run on any card offering from either competitor, so a dead-match washout".
    This is the very definition of the red raging fanboy. When they lose in every fashion and form, suddenly everything is a tie and all cards are good, as they are all capable.
    You will never see the reverse done for Nvidia, it does not happen here, nor almost anywhere else.
    The real truth is more than apparent.
    The 680 is about 20% faster across the board, costs less, lower thermals and wattage, smaller core, with an immensely larger feature set that now the raging reds will have extreme trouble lying about over and over again as they have for years claiming "they don't care about any of it and it all sucks".
    Worse yet for them the 7970 loses in triple monitor gaming as well, the big fat ram lie is kicked to the curb as it should have been YEARS AGO.
    The GTX590 still has the single card crown these low life liars still cannot admit as we see it in so many of these reviews - but for the little lying amd fan, "it was not sufficiently dethroned" and for this reviewer as it shows up on top or under the 680 over and over again, it is "defunct" - THAT'S RIGHT CHRIS IT IS ACCORDING TO YOU DEFUNCT EVEN AS YOU INCLUDE IT IN YOUR BENCHMARKS.
    ---
    thanks for the thousands of endless lies in red fanboy favor. thanks so much !
    I'm sure you all have Nvidia cards in your own personal systems as well! I know you do; that proves you aren't a lying sack.
    way to go...
  • silicondoc_85, 23 March 2012 03:54 (-4)
    I can't even read this CRAP without getting furious.

    " The GeForce GTX 680 takes second place in Skyrim at 1680x1050 and 1920x1080. But across-the-board performance is so good that the victory isn’t entirely meaningful.

    (second means the top card GTX dual 590 wins, while the 680 smokes the lame 7970)

    Frame rates slow down just enough at 2560x1600 that the Radeon HD 6990 sneaks past Nvidia’s new card.

    ( but here cannot bring himself to state that once again the GTX590 BEATS THE 6990 AND TAKES FIRST PLACE - ALWAYS WORDED IN A GIANT BIAS FOR THE RED CARDS FAVOR - ALWAYS WITHOUT EXCEPTION)

    Again, though, neither the 6990 nor the GTX 590 are even for sale anymore, so their significance is largely symbolic. Frankly, I’m glad to see them go."

    ( Finally after blowing it, putting down the 680 as second place, then noting only the 6990 in name "beating the 680" and never mentioning the gtx590 beating the 6990, chris the biased perp dismisses the 6990 loss and the 590 win and proclaims he's glad to "see them go" - since of course his purposes for biased fanboyism have been served)

    I mean you HAVE TO BE BLIND NOT TO SEE IT ON PAGE AFTER PAGE, AND CHRIS ANGELI IS DEFINITELY BLIND.
  • Anonymous, 23 March 2012 03:55 (+4)
    @silicondoc_85
    Might be my fault because of a fever, but after reading your post twice I still don't get what your point is.

    Apart from that, this card gives some nice insight into what Kepler has to offer.
    I'm still waiting to see other models before making any decisions.
  • Marsas, 23 March 2012 04:39 (+3)
    Wow, I was just waiting for Nvidia cards to show up so AMD would lower the price of their high end cards (7950 and 7970) and I could buy one, but after this, I don't know how much the price should drop to make up for performance difference against this card.
  • silicondoc_85, 23 March 2012 04:55 (-3)
    More sick red bias of unbelievable proportion on power efficiency.

    " We set the GTX 580 as 100%, and the rest of the results speak for themselves. "

    (He tries to get away with saying nothing here, given the 172% GTX680 massive, massive win, but can't do it, so spinning is absolutely required below)

    The Radeon HD 7970 and 7950 both do deliver more performance per watt of power used compared to GeForce GTX 580—and by a significant amount.

    ( First he must brag up his favored brand, slamming down on the Nvidia card)

    But GeForce GTX 680 is like, way up there.

    ( Finally he brings himself to say it - it was very difficult, but the Nvidia win here is so enormous, he had to first try to say nothing, then brag up amd cards against nvidia, then finally in a tiny sentence, quickly mention the 680 being up there. Immediately afterwards, as usual the red fan has to discount and deny this massive accomplishment as this site and red fans and he has been pushing amd power efficiency wins down everyone's throat for the last two years solid. Now you shall see, it won't matter...)

    As a gamer, do you care about this?

    ( As a red fanboy, this is now "unimportant" when Nvidia scores the massive win, so Chris tells you you really shouldn't care, for the 1st time EVER when it comes to power efficiency - when RED LOSES BAD and NVIDIA wins )

    Not nearly as much as absolute performance, we imagine.

    ( THE BIAS reeks again, the Nvidia card has the best absolute performance too, but since AMD can't claim a win on power, we switch up... and omit the dual Nvidia win being mentioned directly - we go on further pretending that didn't happen, so that we can completely dismiss power/performance this time around, with Chris Angelini's massive red fan bias)

    And I personally doubt I’d ever pay more for a card specifically because it gave me better performance/watt.

    ( Here Chris declares the years of power/perf pushing what they should have been for the last two years here, not a paid for feature - but it was pushed as the reason why AMD cards must be chosen - worse yet Chris pretends you have to pay for it here, WHEN THE WINNING CARD THE NVIDIA GTX680 COSTS LESS AND IS MOST EFFICIENT... as Chris' fanboy mind twists against Nvidia, his analysis twists sideways as well, clearly completely losing track of the facts he just discovered in testing, that Nvidia is faster, CHEAPER, AND MORE POWER EFFICIENT ...)

    But with AMD and Nvidia both talking about their efficiency this generation, thanks to 28 nm manufacturing and new architectural decisions, the exercise is still interesting.

    ( LMAO - the last sentence "declares the tie!" after Nvidia's smashing win - the conclusion - they both talk about it, and "it's interesting"... ANOTHER GIGANTIC BIAS IN THE LOSER AMD'S FAVOR )
    ----
    I don't mind a slip here or there, but when entire sections are mindlessly biased in favor of AMD, over and over and over again... IT REALLY PISSES ME OFF !

    How about the truth next time Chris ? How about forcing yourself to remove your gigantic amd bias in your thoughts and typing... ? Here I WILL FIX IT
    --------------------------

    The GeForce GTX 680 won the power efficiency test by an enormous margin, as you see above. Nvidia's and AMD's respective new architectures diverge greatly here, as Nvidia is faster and more power efficient: everything we've been promoting about AMD's cards for years.
    The tables have more than completely turned.

    The Radeon HD 7970 and 7950 both deliver more performance per watt than AMD's prior generation, but they cannot come close to the GeForce GTX 680, which is so good we have to make the same recommendation for it that we have made for years for AMD cards that were nowhere near this good.

    As gamers, we've cared about this for years on this site, often placing it above absolute performance in the purchase decision.
    And I personally promoted AMD cards specifically because they gave better performance/watt.

    With Nvidia now talking about this like AMD has for years, we can't just suddenly dismiss this as we'd like to and tell you it means nothing.

    (although they/Chris did of course in the real article)

  • jakjawagon, 23 March 2012 05:23 (+2)
    Quote: "anything below 60 Hz has to still be a multiple of 60"
    This would be impossible and makes no sense. I think you meant 'factor'. /pedant
  • Anonymous, 23 March 2012 14:53 (+2)
    Is there any reason why there are no Eyefinity tests? You already said that Nvidia now supports three or more screens, so let's see it. I want to see Battlefield 3 across 3 displays with ultra settings...

    Also gamers who can afford this level of hardware don't want to know about resolutions of 1680x1050...
  • santfu, 24 March 2012 08:39 (+1)
    I'm not sure that silicondoc is feeling very well. The only fanboyism that I see is from silicondoc.

    I agree with Ashleyh: for top-end cards, 1680x1050 is now a pointless test. For the cost of these cards you could get a very nice 1920x1080 monitor, or three not-so-nice ones.
  • dizzy_davidh, 24 March 2012 11:47 (-1)
    Quote: silicondoc_85, 23 March 2012 03:44 (full comment quoted above)

    I have to agree. Whenever there is an nVidia review at Tom's Hardware, its test results never come out as good as they should, compared to either the manufacturer's own figures or my personal results.

    As for the 590 being slower fps-wise in games like BF3 than the 680, nVidia themselves have results showing that a 590 will beat a 680, so again your results are crap!

    I can only think that your test setup is flawed in some way or you simply have no idea what you are talking about (I suspect the latter).
  • SSri, 23 April 2012 22:55 (+1)
    The GTX 680 comes out extremely poor in compute performance, scoring barely a third of the GTX 580's! Is there any pun intended in that card? I can't believe that a top-of-the-line card can be so poor at computation. It is pretty fishy...