Nvidia GeForce GTX 1070 8GB Pascal Performance Review

A couple of weeks ago, we introduced the world’s fastest desktop-oriented GPU in our Nvidia GeForce GTX 1080 Pascal Review. Reactions were mixed, not surprisingly—any time you deal with a topic as contentious as graphics, half of the audience walks away impressed, while the other half decries a lack of progress. But there’s no arguing the benchmarks. GeForce GTX 1080 beats GeForce GTX Titan X across the board. It beats Radeon R9 Fury X across the board. And there’s a good chance the 1080 will go uncontested for much of 2016 as AMD prepares its Polaris architecture for a run at the more volume-oriented segment.

Where Nvidia seemed to step in it was with its decision to charge an extra $100 for its reference card, now called the Founders Edition. Many Tom’s Hardware readers are reluctant to pay more than MSRP for the industrial design made popular by its radial fan and windowed shroud, particularly since overclocking appears limited by the cooler’s lack of thermal headroom.

Prepare yourself for the same debate, then, as Nvidia lifts the veil on GeForce GTX 1070 performance. The company tells us the Founders Edition card will sell for $449. Meanwhile, board partners are readying their own designs starting at an MSRP of $379. Availability isn’t expected until June 10th though, so it remains to be seen if those prices hold or if launch-day demand drives them up. Such is the trouble with paper launches.

At least we can be confident that the performance of GeForce GTX 1070 won’t change. The card we have in our lab represents what you’ll have access to in a couple of weeks, and it’s an impressive piece of hardware.

Meet The GeForce GTX 1070

You’ll notice that GeForce GTX 1070 looks almost exactly like the 1080. Indeed, it borrows a lot from its big brother, both electrically and mechanically.

There’s the GP104 GPU, to start. Whereas GeForce GTX 1080 comes equipped with a full GP104 sporting 20 Streaming Multiprocessors across four Graphics Processing Clusters, the 1070 sheds a complete GPC, losing five SMs in the process. That leaves it with 15 SMs, or 1920 CUDA cores (vs. 2560) and 120 texture units (vs. 160). Nvidia further detunes its Pascal-based processor by dialing the GPU’s base frequency down to 1506MHz and its specified GPU Boost clock rate to 1683MHz (these are 1607MHz and 1733MHz, respectively, on the GTX 1080).
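Those figures follow directly from Pascal's per-SM layout of 128 CUDA cores and 8 texture units. A quick back-of-the-envelope check (our own arithmetic, not official Nvidia math) reproduces the cut-down spec and the single-precision throughput in the table below:

```python
# Derive GTX 1070 shader resources from its SM count.
# Pascal (GP104): 128 CUDA cores and 8 texture units per SM.
CORES_PER_SM = 128
TMUS_PER_SM = 8

sms = 15                                   # 3 of 4 GPCs enabled, 5 SMs each
cuda_cores = sms * CORES_PER_SM
texture_units = sms * TMUS_PER_SM

base_clock_ghz = 1.506
gflops = 2 * cuda_cores * base_clock_ghz   # 2 FLOPs per core per cycle (FMA)

print(cuda_cores)      # 1920
print(texture_units)   # 120
print(round(gflops))   # 5783
```

The same arithmetic with 20 SMs and a 1607MHz base clock lands on the GTX 1080's 2560 cores and 8228 GFLOPs.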

Nvidia doesn’t touch the chip’s back-end. You still get eight 32-bit memory controllers with eight ROPs and 256KB of L2 cache bound to each. In total, that’s 64 ROPs and 2MB of L2. But whereas the GeForce GTX 1080 sports 8GB of 10 Gb/s GDDR5X, 1070 gets 8GB of 8 Gb/s GDDR5 from Samsung. Memory bandwidth consequently peaks at a nice round 256 GB/s, or 14% higher than GeForce GTX 980. The GeForce GTX 980 Ti and Titan X actually benefit from more throughput than the 1070 due to their 384-bit interfaces (as do several AMD cards with 384- and 512-bit buses). However, Nvidia maintains that the improved delta color compression we discussed in our GTX 1080 review yields 20% more effective bandwidth by reducing bytes fetched.
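Peak memory bandwidth is simply the per-pin data rate multiplied by the bus width. A short sketch (our arithmetic, not an official formula) shows how the 256 GB/s figure and the 14% advantage over the GTX 980 fall out:

```python
def peak_bandwidth_gbps(data_rate_gbit: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin rate (Gb/s) x bus width / 8 bits per byte."""
    return data_rate_gbit * bus_width_bits / 8

gtx_1070 = peak_bandwidth_gbps(8, 256)   # 8 Gb/s GDDR5 on a 256-bit bus
gtx_980 = peak_bandwidth_gbps(7, 256)    # 7 Gb/s GDDR5 on a 256-bit bus

print(gtx_1070)                          # 256.0
print(gtx_980)                           # 224.0
print(f"{gtx_1070 / gtx_980 - 1:.0%}")   # 14%
```

Plugging in 10 Gb/s gives the GTX 1080's 320 GB/s; a 384-bit bus at 7 Gb/s gives the 980 Ti and Titan X their 336 GB/s.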

| GPU | GeForce GTX 1070 (GP104) | GeForce GTX 970 (GM204) | GeForce GTX 1080 (GP104) | GeForce GTX 980 (GM204) |
| --- | --- | --- | --- | --- |
| SMs | 15 | 13 | 20 | 16 |
| CUDA Cores | 1920 | 1664 | 2560 | 2048 |
| Base Clock | 1506MHz | 1050MHz | 1607MHz | 1126MHz |
| GPU Boost Clock | 1683MHz | 1178MHz | 1733MHz | 1216MHz |
| GFLOPs (Base Clock) | 5783 | 3494 | 8228 | 4612 |
| Texture Units | 120 | 104 | 160 | 128 |
| Texel Fill Rate | 201.9 GT/s | 122.5 GT/s | 277.3 GT/s | 155.6 GT/s |
| Memory Data Rate | 8 Gb/s | 7 Gb/s | 10 Gb/s | 7 Gb/s |
| Memory Bandwidth | 256 GB/s | 196 GB/s + 28 GB/s (segmented) | 320 GB/s | 224 GB/s |
| ROPs | 64 | 56 | 64 | 64 |
| L2 Cache | 2MB | 1.75MB | 2MB | 2MB |
| TDP | 150W | 145W | 180W | 165W |
| Transistors | 7.2 billion | 5.2 billion | 7.2 billion | 5.2 billion |
| Die Size | 314mm² | 398mm² | 314mm² | 398mm² |
| Process Node | 16nm | 28nm | 16nm | 28nm |

Aside from the model name etched into its shroud, the GeForce GTX 1070 Founders Edition looks just like the 1080, and that’s good. Dating back to the GTX 690, Nvidia’s industrial design has influenced what we expect from a high-end graphics card. And although I personally prefer the cleaner lines of generations past to this new faceted body, it’s more important that cooling and acoustics are well-executed. We expect board partners to offer configurations with superior thermal capacity. However, those models typically exhaust waste heat back into your chassis. Some enthusiasts are fine with that and design their PCs accordingly. But as the owner of a small form factor gaming system, I have to be more discerning. Nvidia’s radial fan pushes heated air through aluminum fins and out the I/O bracket.

Under the shroud, differences between the 1070 and 1080 become more apparent. Whereas the 1080 employs a vapor chamber solution, 1070 sports an aluminum heat sink with three embedded copper heat pipes. Almost assuredly this is a cost-cutting measure related to the 1070’s 150W TDP. A lower-power card simply doesn’t need such a beefy cooler, even if it would undoubtedly help GeForce GTX 1070 overcome some of the thermal limits we saw the 1080 hit.

Down at the PCB level, Nvidia implements a four-phase dual-FET design instead of the 1080’s five-phase power supply. Check out the 1070's bare PCB above, and the 1080's below for comparison.

The GTX 1070’s I/O bracket includes the same three full-size DisplayPort 1.3/1.4-ready outputs as 1080, along with one HDMI 2.0b connector and a dual-link DVI port. Up top, you’ll again find two SLI interfaces that accommodate Nvidia’s new high-bandwidth bridges. Nothing changes with regard to SLI support: two GPUs are still the max, although the company will presumably give 1070 owners access to the same unlock key available to anyone with a 1080, enabling three- and four-way setups as well. And GeForce GTX 1070 similarly sports a single eight-pin connector to complement power delivered across the 16-lane PCIe slot.

