Mining Ethereum: GPU Power Consumption
Test Results: Power Consumption For Mining & Gaming
Compared to previous GPU generations, AMD's and Nvidia's 14nm/16nm offerings dramatically improve performance at every given power target. In short, they're far more efficient than anything that came before.
Power consumption matters enormously to a graphics card: it affects performance, noise output, and reliability, which is directly tied to operating temperature. The hotter a graphics card is allowed to get, the more stress its components are forced to endure.
This, in a nutshell, is why all modern GPUs throttle before crossing a manufacturer-defined threshold. Pulling back on voltage and frequency sacrifices performance, but that's better than compromising the board's health.
In an effort to quantify the behavior of modern graphics cards, we combine the data-collection capabilities of Powenetics, GPU-Z's logging, and a handful of workloads that utilize GPUs in different ways. The polling rate for our Powenetics measurements is 2ms (0.002s), while GPU-Z reads every 100ms (0.1s). The workloads comprise the following:
- Average (Game): We run Metro: Last Light at 1920x1080 and measure the average of all readings while the game's benchmark is rendering (no title/loading screen). We run the benchmark once to allow the graphics card to reach a proper operating temperature before proceeding with our measurements.
- Peak (Game): We use the same settings as above, only this time we take into account the single peak wattage reading that our equipment records.
- Average (FurMark): We run FurMark's Stability Test at 1600x900 resolution with 0xAA, measuring the average of all readings during a 10-minute period. FurMark applies an unrealistic GPU load, though it does represent a worst-case scenario.
- Peak (FurMark): Same as above, only we record the peak wattage reading during our 10-minute test.
- Average (Mining): We use Claymore's Dual Ethereum AMD+Nvidia GPU Miner to mine Ether for 10 minutes, and measure the average of all readings. We usually conduct two tests, one with the graphics card at stock clocks and one with settings optimized for mining.
- Peak (Mining): We note the peak power reading during our 10-minute mining session. As with the average measurement, we usually run two tests, one at stock clocks and one with settings optimized for mining. To start, though, we only have stock-configuration results.
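The Average and Peak figures above boil down to simple statistics over the logged samples. The sketch below shows one way to derive them from a power log; the two-column CSV layout (timestamp in seconds, watts) and the optional benchmark-window arguments are hypothetical stand-ins, since the real Powenetics export format may differ.

```python
import csv

def summarize(path, start_s=None, end_s=None):
    """Return (average, peak) wattage over an optional time window.

    start_s/end_s let you clip the log to just the benchmark run,
    excluding title and loading screens, as described above.
    """
    samples = []
    with open(path, newline="") as f:
        for ts, watts in csv.reader(f):
            t, w = float(ts), float(watts)
            if start_s is not None and t < start_s:
                continue
            if end_s is not None and t > end_s:
                continue
            samples.append(w)
    return sum(samples) / len(samples), max(samples)

# Example (hypothetical file name):
# avg_w, peak_w = summarize("furmark_10min.csv")
```

At a 2ms polling rate, a 10-minute run yields roughly 300,000 samples, which is why a single transient spike can push the Peak number well above the Average.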
For all measurements, we maintain an ambient temperature around 70°F (21.1°C) with a humidity level close to 35%.
The GeForce GTX 1060 6GB maintains low relative power consumption, needing ~100W during our mining workload. In comparison, Nvidia's GeForce GTX 1080 Ti Founders Edition registers the highest average use in every test, exceeding even the Nvidia Titan Xp's power consumption. Finally, the Galax GTX 1080 EXOC hits the highest peak power consumption at 270.37W during FurMark.
If you want to keep power consumption low, you'd ideally install several GeForce GTX 1060 6GB Founders Edition graphics cards. With each needing only one auxiliary PCI Express power connector, you can run up to six from a good 850W PSU, or ten from a 1.2kW PSU (given enough PCIe connectors). When it comes to mining, the GeForce GTX 1080 isn't particularly power-hungry. But as you'll see on the next page, its hash rate is low. You're better off with a GeForce GTX 1060, GTX 1070, or GTX 1070 Ti. (If you can afford it, the GTX 1080 Ti is an option, as well.)