Gigabyte GeForce GTX 950 Xtreme Gaming Review

Overclocking, Noise, Temperature And Power

When it comes to overclocking, nothing beyond the factory frequencies is guaranteed. Picking a card like Gigabyte's GeForce GTX 950 Xtreme Gaming, which sports a cherry-picked GPU, certainly helps. But it still comes down to the silicon lottery.

OC Guru II

My go-to utility for overclocking graphics cards is MSI's Afterburner, which was designed to support almost any GPU. But when companies go through the effort of creating their own tools, I like to give them a shot.

Gigabyte provides OC Guru II with its GeForce GTX 950 Xtreme Gaming card. This software lets you monitor temperatures, clock rates, voltage levels and fan speeds. OC Guru II also enables GPU clock, memory clock, core voltage and fan speed adjustments. Plus, it gives you control over the Windforce logo's LEDs.

Much like Afterburner, OC Guru II lets you adjust the GPU and memory frequencies in granular steps. Each click of the arrow buttons moves the clock rate by 1MHz, and voltage adjustments are applied in 0.0125V increments.

Unlike most GPU overclocking tools, the temperature and power target values are not linked by default. You're able to increase the power target an additional eight percent, and the temperature can be increased by 15 degrees Celsius to a threshold of 95 degrees. Either option can be maxed out without affecting the other. OC Guru can also be configured to prioritize the GPU temperature over the power limit.

Overclocking

To overclock Gigabyte's GeForce GTX 950 Xtreme Gaming, we first adjusted the power limit. We didn't bother with the temperature limit because the GPU wasn't even approaching 80 degrees under extreme load. From there, we played with the core clock in 5MHz increments until the card crashed, after which point we used single-MHz tweaks. Ultimately, we saw stability at 1273MHz, or 70MHz higher than Gigabyte's shipping frequency. Increasing the voltage a few notches proved to be no help in achieving higher clock speeds.
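The stepping procedure above amounts to a simple two-pass search: raise the clock in coarse steps until the card crashes, then back off and creep up in single-MHz steps. A minimal sketch of that logic, with a hypothetical `is_stable` callback standing in for the manual stress-test runs we performed (the actual tuning was done by hand in OC Guru II):

```python
def find_stable_clock(base_mhz, is_stable, coarse=5, fine=1, limit=1400):
    """Raise the core clock in coarse steps until instability,
    then continue from the last known-good clock in fine steps.
    Returns the highest clock that passed the stability check."""
    clock = base_mhz
    # Coarse pass: step by 5MHz until the next step would fail.
    while clock + coarse <= limit and is_stable(clock + coarse):
        clock += coarse
    # Fine pass: step by 1MHz from the last known-good clock.
    while clock + fine <= limit and is_stable(clock + fine):
        clock += fine
    return clock

# Simulated silicon that becomes unstable above 1273MHz, matching
# the result we landed on (base clock 1203MHz):
print(find_stable_clock(1203, lambda mhz: mhz <= 1273))
```

In practice the "stability check" is a lengthy gaming or stress-test session per step, which is why the coarse pass exists at all: single-MHz steps from the factory clock would take far too long.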

Gigabyte uses memory rated at the same speed Nvidia specifies for the GeForce GTX 960. With the GDDR5 already running at its peak rate, overclocking was bound to be a struggle. In the end, we achieved an additional 20MHz.

The overclocked settings were stable, except in Shadow of Mordor. For some reason, that title penalizes our tuned configuration with lower performance than the factory setup. There were no crashes or artifacts, just a lower frame rate.

Noise, Power and Temperature

Noise

Gigabyte's GTX 950 may have outperformed Asus' card in the gaming tests, but if you want the quietest card available, it falls short. A reading of 39 decibels is certainly not loud, and you probably won't notice it. However, the measurement is still higher than Asus' card.

Interestingly, before the driver was installed, the card's fans were spinning particularly quickly, causing a slight "tinging" sound. There were no indications of a loose component or bad bearing, so I suspect it was caused by vibrations from wind turbulence. During normal operation, this behavior was not observed.

Power

The power consumption numbers were a little surprising. Higher draw after overclocking was expected, but the lowest idle consumption figure at stock speeds was not. Under load, Gigabyte's card uses quite a bit of power. At idle, though, it registers 1.5W less than Asus' GTX 950 Strix.

Temperature

Gigabyte's Windforce cooling solution does a great job keeping the GPU temperature under control. During a 10-minute run of Battlefield 4, the processor barely crept up over the 60-degree mark. Curiously, the GPU on Gigabyte's card heated up rapidly and then stabilized once the fan kicked on. In contrast, the GTX 950 Strix took a couple of minutes to heat up, after which it maintained a higher peak temperature.

Even after we applied our overclocked settings, the GeForce GTX 950 Xtreme Gaming managed to hold essentially the same temperature pattern through the test.

MORE: Best Graphics Cards
MORE: All Graphics Content

28 comments
  • chaosmassive
    for future benchmark, please set to 1366x768 instead of 720p as bare minimum
    because 720p panel pretty rare nowadays, game with resolution 720p scaled up for bigger screen, its really blur or small (no scaled up)
  • kcarbotte
    Quote:
    for future benchmark, please set to 1366x768 instead of 720p as bare minimum because 720p panel pretty rare nowadays, game with resolution 720p scaled up for bigger screen, its really blur or small (no scaled up)


    All of the tests were done at 1366x768.
    Where do you see 720p?
  • rush21hit
    I have been comparing test result for 950 from many sites now and that leaves me to a solid decision; GTX 750Ti. I'm having the aging HD6670 right now.

    Even the bare bone version still needed 6pin power and still rated 90Watt, let alone the overbuilt. As someone who uses a mere Seasonic's 350Watt PSU, I find the 950 a hard sell for me. Add in CPU OC factor and my 3 HDD, I believe my PSU is constrained enough and only have a little bit more headroom to give for GPU.

    If only it doesn't require any additional power pin and a bit lower TDP.
    Welp, that's it. Ordering the 750Ti now...whoa! it's $100 now? yaayyy
  • ozicom
    I decided to buy a 750Ti past but my needs have changed. I'm not a gamer but i want to buy a 40" UHD TV and use it as screen but when i dig about this i saw that i have to use a graphics card with HDMI 2.0 or i have to buy a TV with DP port which is very rare. So this need took me to search for a budget GTX 950 - actually i'm not an Nvidia fan but AMD think to add HDMI 2.0 to it's products in 2016. When we move from CRT to LCD TV's most of the new gen LCD TV had DVI port but now they create different ports which can't be converted and it makes us think again and again to decide what to buy.
  • padremaronno
    > mid-low end card
    > extreme gaming
  • InvalidError
    777314 said:
    now they create different ports which can't be converted

    There are adapters between HDMI, DP and DVI. HDMI to/from DVI is just a passive dongle either way.
  • Larry Litmanen
    Obviously these companies know their buying base far better than i do, but to me the appeal of 750TI was that you did not need to upgrade your PSU. So if you have a regular HP or Dell you can upgrade and game better.

    I guess these companies feel like most people who buy a dedicated GPU probably have a good PSU.
  • TechyInAZ
    Looks great! Right off the bat it was my favorite GTX 950 card since Gigabyte put some excellent aesthetics into the card, but I will still go with EVGA.
  • matthoward85
    Anyone know what the SLI equivalent would be comparable to? greater or less than a gtx 980?
  • silverblue
    How is the texture fillrate of the Strix ahead of the higher clocked Xtreme? :)
  • Eximo
    Scaling isn't perfect, but in terms of raw silicon this is what you have. So a pair of 950 would be about a GTX970.

    GTX950 = 6 SM units
    GTX960 = 8 SM Units
    GTX970 = 13 SM Units
    GTX980 = 16 SM Units
    GTX980 TI = 22 SM Units
    Titan X = 24 SM Units
  • logainofhades
    700347 said:
    I have been comparing test result for 950 from many sites now and that leaves me to a solid decision; GTX 750Ti. I'm having the aging HD6670 right now. Even the bare bone version still needed 6pin power and still rated 90Watt, let alone the overbuilt. As someone who uses a mere Seasonic's 350Watt PSU, I find the 950 a hard sell for me. Add in CPU OC factor and my 3 HDD, I believe my PSU is constrained enough and only have a little bit more headroom to give for GPU. If only it doesn't require any additional power pin and a bit lower TDP. Welp, that's it. Ordering the 750Ti now...whoa! it's $100 now? yaayyy


    GTX 950 only requires a 350w PSU, just as an FYI. What CPU do you have?
  • none12345
Hrm i paid only $10 more than this card for a gigabyte r9 380 4gig 2-3 weeks ago. I think that was the better deal.
  • spentshells
    Again with the itx based 380, there were enough complaints last time to actually do something about that. The slanted playing field is slanted
  • kcarbotte
    53571 said:
    Again with the itx based 380, there were enough complaints last time to actually do something about that. The slanted playing field is slanted


    The ITX 380 is the only R9 380 that I have.
    Nothing I can really do about that until another vendor decides they want to send one.
  • kcarbotte
    Quote:
    Anyone know what the SLI equivalent would be comparable to? greater or less than a gtx 980?


I've done some tests with SLI GTX 950s. Raw performance falls somewhere between a GTX 970 and a GTX 980, but in games that have high memory demands, two 950s fall on their face due to the 2GB frame buffer.
  • TechyInAZ
    1943658 said:
    Quote:
    Anyone know what the SLI equivalent would be comparable to? greater or less than a gtx 980?
    I've done some test with SLI GTX 950s. Raw performance falls somewhere between a GTX 970 and a GTX 980, but in games that have high memory demands two 950s fall on thier face due to the 2GB frame buffer.


    Makes sense. The only advantage to adding a 2nd gtx 950 in SLI would be going from a 60hz 1080P monitor to a 144hz+ refresh rate monitor.
  • kcarbotte
    1864420 said:
    Anyone know what the SLI equivalent would be comparable to? greater or less than a gtx 980?


Agreed.
    If 1080p high refresh is your setup, then two 950s are potentially a very economical setup.
  • Onus
    Looking at the game settings, you were at or near "ultra," so the GTX950 looks like a great choice for 1080p. I could not help but notice that the GTX750Ti also appeared to be playable (imho) on these settings as well, or so close that only one or two of them might need to be turned down one notch; that's pretty good for ~$45 less money. Those not able or willing to upgrade their PSUs (possibly in OEM boxes) will not be suffering if they have to game on a GTX750Ti.
  • kcarbotte
    47340 said:
    Looking at the game settings, you were at or near "ultra," so the GTX950 looks like a great choice for 1080p. I could not help but notice that the GTX750Ti also appeared to be playable (imho) on these settings as well, or so close that only one or two of them might need to be turned down one notch; that's pretty good for ~$45 less money. Those not able or willing to upgrade their PSUs (possibly in OEM boxes) will not be suffering if they have to game on a GTX750Ti.


    I agree. I bought a GTX 750 Ti last spring and paired it with an i3. The combination did very well and I was more than happy playing games on that setup. That system was meant to be sold but I ended up keeping it for myself. I now use it as a living room gaming PC/ media center.

    The GTX 950 far outperforms it, but there's a reason the 750ti is still in the lineup. It's a great product for the money.
  • knowom
If you're going to get a GTX 960, get a 4GB version or don't bother, and OC the memory to about 8GHz; they don't really show their true muscle until you do that.
  • spentshells
    1943658 said:
    53571 said:
    Again with the itx based 380, there were enough complaints last time to actually do something about that. The slanted playing field is slanted
    The ITX 380 is the only R9 380 that I have. Nothing I can really do about that until another vendor decides they want to send one.


    Yes but this is a hardware site and the crowd has already spoken on this and found it's not a fair shake and it skewed the results so again please find the money for a 380 in regular format or borrow one.
  • rush21hit
    Quote:
    GTX 950 only requires a 350w PSU, just as an FYI. What CPU do you have?


    I know it does. But with 3 HDD and CPU OC with big fan cooler. Also add many other stuff that need power from my PC, lead to this decision.
    My CPU is a very old Q6600 OC @3,6ghz.
  • Novakane_
    Whats the difference between this and the SC version?