AMD Ryzen 7 1700 CPU Review

Power Consumption & Temperatures

Direct Comparisons of Power Consumption

The 65W 1700 consumes slightly more power at idle than the 95W Ryzens.

Meanwhile, our mildly overclocked Core i7-6900K consumes less power at idle than its stock configuration because we reduced its single-core Turbo Boost frequency to achieve a 3.9 GHz clock rate.

The 1700's power consumption is impressive during the AutoCAD 2015 workload; it only consumes 29.3W. A stock Core i7-7700K uses considerably more power. But looking at these figures on their own can be misleading. Remember that Intel's top Kaby Lake-based CPU demonstrated a commanding lead in the previous page's AutoCAD workloads, so it ends up offering superior performance per watt.
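
To put the performance-per-watt point in concrete terms, here is a minimal sketch of the underlying math. Only the 1700's 29.3W AutoCAD figure comes from this test; the benchmark scores and the 7700K's power draw below are placeholder values, not measured results.

    # Performance-per-watt sketch. Higher points/W is better.
    # Only the 29.3W figure comes from this test; every other number
    # here is a placeholder chosen purely for illustration.
    def perf_per_watt(score: float, watts: float) -> float:
        return score / watts

    ryzen_1700 = perf_per_watt(score=100.0, watts=29.3)      # ~3.41 points/W
    core_i7_7700k = perf_per_watt(score=170.0, watts=48.0)   # ~3.54 points/W

    # With these illustrative inputs, the 7700K draws more power but still
    # comes out ahead on points/W, which is the trade-off described above.
    print(f"Ryzen 7 1700:   {ryzen_1700:.2f} points/W")
    print(f"Core i7-7700K:  {core_i7_7700k:.2f} points/W")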

AMD's Ryzen 7 1700 proves its frugal nature by drawing only 44.3W during our gaming benchmark. The Core i7-6900K consumes less power than Intel's Core i7-7700K, likely because the workload doesn't fully utilize all eight of its cores.

The 32nm FX-9590 is in a class of its own, which isn't a good thing. Still, it highlights one of the 14nm process' main advantages.

Prime95's Small FFT stress test pushes power consumption to the max, revealing one of the 65W 1700's best attributes: it consumes 23.3W less than the 1700X. The 1700's modest power use, coupled with a small performance delta between it and the 1700X in our application benchmarks, paints a convincing picture of efficiency.

Temperatures

We optimized our CPU cooler for Socket AM4 by using two nuts between the spring and bracket to increase the mounting pressure on the package, tightening to 0.4 Nm. That is why these results differ from those in our launch article, where we only used washers.

AMD and Intel employ different temperature measurement methodologies. While these readings aren't entirely comparable, they do serve as a close approximation.

In its stock form, Ryzen 7 1700 runs cooler than the rest of the field due to its lower TDP. Of course, all bets are off once you start overclocking and dialing in higher voltages. In any case, AMD uses solder between its die and heat spreader, which generally provides better thermal transfer than thermal paste. Intel famously uses thermal paste and contends that it boosts processor longevity.

Even though our results aren't apples-to-apples, it's clear that the 1700's 65W TDP does convey an expected power and thermal advantage over the other Ryzen 7s. 

68 comments
  • mitch074
    And how about testing with an AMD GPU? Seems Ryzen gets the short end of the stick when using an Nvidia GPU... https://forums.overclockers.co.uk/threads/nvidia-dx12-driver-holding-back-ryzen.18774744/
  • Sakkura
    Why are the graphs blurry?
  • envy14tpe
    Why no 1440p or 4K gaming? Who buys a 1700 for 1080p gaming? In gaming, the new 1700, 1700X, and 1800X don't compare to the 7700K. But I don't see that here, unlike the testing methodologies used by the likes of GamersNexus and whatnot.

    EDIT. Based on the downvoting of this comment it seems AMD lovers are a little butt hurt.
  • ykki
    Thanks for the review. Will Tom's bench (or has it already benched) the R5s with AMD GPUs (i5 + 1060, i5 + 480, R5 + 1060, R5 + 480)?
  • PaulAlcorn
    699111 said:
    Why no 1440p or 4K gaming? Who buys a 1700 for 1080p gaming? In gaming, the new 1700, 1700X, and 1800X don't compare to the 7700K. But I don't see that here, unlike the testing methodologies used by the likes of GamersNexus and whatnot.


    Here is some recent testing at 1440p. It includes the 1700, as well.

    http://www.tomshardware.com/reviews/amd-ryzen-vs-intel-kaby-lake-gaming,4977.html
  • envy14tpe
    1920539 said:
    699111 said:
    Why no 1440p or 4K gaming? Who buys a 1700 for 1080p gaming? In gaming, the new 1700, 1700X, and 1800X don't compare to the 7700K. But I don't see that here, unlike the testing methodologies used by the likes of GamersNexus and whatnot.
    Here is some recent testing at 1440p. It includes the 1700, as well. http://www.tomshardware.com/reviews/amd-ryzen-vs-intel-kaby-lake-gaming,4977.html


    That shows the new AMD CPUs as is. From all I see, the i7-7700K blasts the new AMD 1700, 1700X, and 1800X series at 1440p and above. That's important to keep in mind for gamers who want the most out of a CPU and high-end GPU.
  • ddpruitt
    Quote:
    even if Ryzen isn't shaping up to be universally superior, as many hoped prior to launch.


    Quote:
    This makes it difficult to universally recommend those high-end parts.


    Why do they have to be universally superior? They do a killer job on highly threaded workloads and are a lot cheaper than equivalent Intel parts. Sure, gaming's a wash, but they're all playable. Aiming for universally superior is shooting for the moon and doesn't happen even with a single Intel chip.

    Quote:
    But looking at these figures on their own can be misleading. Remember that Intel's top Kaby Lake-based CPU demonstrated a commanding lead in the previous page's AutoCAD workloads, so it ends up offering superior performance per watt.


    Any chance you can multiply the numbers out so we can compare the differences?
  • TJ Hooker
    So I have to ask, is there any reason to buy a 1700X/1800X over a 1700 if you're comfortable with overclocking?
  • Ian_85
    Can you please repeat this test after each of the Ryzen BIOS updates in April and May?

    I think people would be interested to see just how much performance in a new CPU architecture improves in the months after its initial release.
  • elbert
    1920539 said:
    699111 said:
    Why no 1440p or 4K gaming? Who buys a 1700 for 1080p gaming? In gaming, the new 1700, 1700X, and 1800X don't compare to the 7700K. But I don't see that here, unlike the testing methodologies used by the likes of GamersNexus and whatnot.
    Here is some recent testing at 1440p. It includes the 1700, as well. http://www.tomshardware.com/reviews/amd-ryzen-vs-intel-kaby-lake-gaming,4977.html

    I don't believe that has the updated Ashes of the Singularity tests. Good review, and I would like to see more. Now that all the Ryzens have been benchmarked at 1080p, maybe 1440p and 4K would make a good review, with and without SLI/CrossFire, just to see how it works for Ryzen. Possibly G.Skill could pitch in some of their Flare X 3466 RAM for Ryzen.
  • max0x7ba
    RAM is the bottleneck in modern systems: the CPU can process data faster than it can be read and written from/to RAM.

    The benchmarks use 2400 MT/s RAM for Intel and 2666 MT/s for Ryzen. That's an 11% difference in RAM speed. How is that a valid comparison?

    I use a 7700K with 4000 MT/s RAM, for example.
  • Corey_38
    Don't know how to quote, but in reply to the RAM question: I've read FPS barely scales with RAM speed on Intel but does scale very well on Ryzen.
  • TJ Hooker
    2018360 said:
    RAM is the bottleneck in modern systems: the CPU can process data faster than it can be read and written from/to RAM. The benchmarks use 2400 MT/s RAM for Intel and 2666 MT/s for Ryzen. That's an 11% difference in RAM speed. How is that a valid comparison? I use a 7700K with 4000 MT/s RAM, for example.

    I have seen little to suggest that modern Intel CPU performance is particularly dependent on RAM speed, especially past a certain threshold.

    Ryzen on the other hand seems to love high speed memory, which is apparently in part due to higher speed RAM actually increasing the speed of the infinity fabric between CCXs.
  • max0x7ba
    1636679 said:
    2018360 said:
    RAM is the bottleneck in modern systems: the CPU can process data faster than it can be read and written from/to RAM. The benchmarks use 2400 MT/s RAM for Intel and 2666 MT/s for Ryzen. That's an 11% difference in RAM speed. How is that a valid comparison? I use a 7700K with 4000 MT/s RAM, for example.
    I have seen little to suggest that modern Intel CPU performance is particularly dependent on RAM speed, especially past a certain threshold. Ryzen on the other hand seems to love high speed memory, which is apparently in part due to higher speed RAM actually increasing the speed of the infinity fabric between CCXs.


    Well, I am a software engineer involved with low latency systems and once your dataset does not fit in L3 cache, or the memory access pattern is not cache-friendly, the memory transfer rates are quite observable.

    Also, have a look at http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/

    The memory benchmarks on Tom's Hardware recently have been of rather poor quality.
  • refillable
    Given that Tom's Hardware reviews have pretty much been hit and miss lately, this one actually stands out amidst other gaming-oriented reviews. This review is definitely much more comprehensive than the first one. I must say the results are as expected, though. Here are some things I found interesting for you guys to (potentially) discuss.

    -I noticed that RAM speeds across the Intel and AMD platforms were running at different data rates. While the difference has been shown not to cause any major discrepancies, it would be fairer if they all ran at the same speed.

    -The latest Ashes of the Singularity patch gives double-digit gains to all Ryzen CPUs, but not enough for the 1800X to catch up with the 6900K, which supposedly has around the same single-threaded performance. Interesting.

    -Please replace Battlefield 4 and Project Cars. Aren't there any other, better games out there? Why on earth are you still using them as a gaming benchmark in 2017?

    -All Ryzen CPUs post fairly consistent results and achieve respectable performance across most titles, except Rise of the Tomb Raider. You should look more into this.

    -The Workstation benchmarks are hit and miss for Ryzen. Intel might have been handicapped here, but so far, Ryzen does not seem like the fastest all-around content creator. Though I have to admit it's still disappointing that a quad core often tops everything else on those charts. Ryzen definitely has its strengths, though.

    -The power figures are mind-bogglingly good. GlobalFoundries did its job very, very well. But I must point out that the results here are a little too good compared to other reviewers', though not far off. It might indicate that Tom's received great Ryzen samples.
  • kiniku
    I'm pleased to see some very viable competition to Intel. But then let's not forget a phrase Tom's coined regarding CPUs and gaming: "The point of diminishing returns". And yes, I look forward to what the Ryzen 5's will do in gaming and everything else.
  • WoWFishmonger
    Paul, there has been some testing that seems to show that the DX12 drivers for Nvidia GPUs are crippling the performance of multi-core CPUs.

    Let's remember here... you are doing a CPU test.
    You are testing games on the CPU.
    The potential of the CPU in those games is heavily affected by the GPU.
    The success and efficiency of the GPU is directly affected by its DRIVER.
    As it stands right now, Nvidia's drivers for DX12 are inferior [or so it seems?]

    Since you are doing a performance comparison of the Ryzen CPUs vs. an Intel CPU, you should also consider using a different GPU from a different vendor [AMD perhaps?] and compare the results. Hell, if Intel has a video card with DX12 support, throw that in the mix as well.

    I would be interested to see if you can confirm the findings of other testers. TBH I don't care if AMD does better or worse in those tests, I just want to know for a FACT if the Nvidia DX12 drivers are affecting the CPU performance.

    As does everyone else in this world......
  • Gillerer
    1636679 said:
    So I have to ask, is there any reason to buy a 1700X/1800X over a 1700 if you're comfortable with overclocking?


    AMD probably does some die harvesting: dies that clock well with moderate voltages are sold as 1800Xs and ones that don't clock without significantly increasing the voltage, but have good voltage characteristics at lower clock speeds, are sold as 1700s.

    This means that your chance of getting good clocks with moderate voltages is better the higher-tier CPU you buy. This was illustrated in the article as well, when they couldn't overclock the 1700 as high.
  • rwinches
    Let's see triple-monitor 1080p testing. Maybe 144Hz.
    More Vulkan games too.
    Radeon cards and FreeSync monitors.
  • TJ Hooker
    1423473 said:
    1636679 said:
    So I have to ask, is there any reason to buy a 1700X/1800X over a 1700 if you're comfortable with overclocking?
    AMD probably does some die harvesting: dies that clock well with moderate voltages are sold as 1800Xs and ones that don't clock without significantly increasing the voltage, but have good voltage characteristics at lower clock speeds, are sold as 1700s. This means that your chance of getting good clocks with moderate voltages is better the higher-tier CPU you buy. This was illustrated in the article as well, when they couldn't overclock the 1700 as high.

    I understand about the binning (although I did miss the fact that Tom's got an extra 100 MHz out of the 1800X, thanks for pointing that out), but from the reviews I've seen it doesn't seem to make much difference. Seems like the 1800X overclocks maybe an extra 100-200 MHz at most (and I've seen at least one example where the 1700 overclocked higher than the 1800X). For an extra $170, doesn't seem to make a lot of sense.
  • TJ Hooker
    2018360 said:
    Well, I am a software engineer involved with low latency systems and once your dataset does not fit in L3 cache, or the memory access pattern is not cache-friendly, the memory transfer rates are quite observable. Also, have a look at http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/ The memory benchmarks on Tom's Hardware recently have been of rather poor quality.

    Wow, yeah performance scales quite noticeably in Handbrake and Adobe CC. Although, not surprisingly, benefits from high speed RAM do seem very application specific.
  • kckrich
    From the intro: " You also save a few bucks with the bundled 95W Wraith Spire cooler, and although we wouldn’t recommend using the stock heat sink for overclocking, it’s a nice addition."

    Paul, how can you guys state that you don't recommend overclocking on the bundled cooler if you have not tested it yet? A 95W cooler should be more than capable of handling a 65W processor.

    Are you guys planning on talking about the coolers, or giving any performance data on them in the future? I would love to see this, as very few sites even gave the coolers a chance, and I've heard good things from everyone who has actually used one.

    Thanks! (this may be a double post, can't find my other)
  • Wisecracker
    Way to go, AMD. I look forward to even better goodies, though a Ryzen 1700 on an X370 ITX (<---- especially one with DP-out) will be big fun, I bets. Not unlike the big fun when Don buys everyone a cold beverage (no ...it could happen, really ...)

    The greatest thing is, AMD has slain the node-shrink bear and come through a new platform launch relatively unscathed. That unfortunately has not always been the case in the past, and it speaks well of their preparation and execution with GloFo and all their partners this go-round.

    Makes me all tingly thinking about respins, new steppings, APUs, SFFs, mobiles, interposers ... :lol:
  • pecul1ar
    Posting this again, as it gets asked quite often in other forums. Owners of older i5s/i7s want to know how far off they are, at least against Ryzen. So mayhaps a follow-up review, after you guys get the R5s?