Gaming And Streaming: Which CPU Is Best For Both?

AMD Threadripper & Intel Skylake-X i9

You expect the best when you drop $1000+ on a CPU for gaming and streaming simultaneously. So, we shifted to the more demanding "fast" encoding preset for this round of tests.
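For readers who want a feel for what that preset change amounts to, below is a minimal sketch of a comparable x264 "fast"-preset encode driven from Python. The file names, bitrate, and resolution are placeholder assumptions for illustration, not our actual OBS configuration.

# Hypothetical sketch of an x264 "fast"-preset encode, roughly analogous to the
# CPU load OBS generates at this setting. File names, bitrate, and resolution
# are placeholders, not the settings used in our tests.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "gameplay_capture.mkv",   # placeholder capture file
    "-c:v", "libx264",
    "-preset", "fast",              # heavier CPU load than the default "veryfast"
    "-b:v", "6000k",                # assumed streaming bitrate
    "-maxrate", "6000k",
    "-bufsize", "12000k",
    "-s", "1920x1080",              # assumed output resolution
    "-c:a", "aac", "-b:a", "160k",
    "stream_encode.mp4",
], check=True)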


Battlefield 1

AMD didn't design Threadripper for "just" gaming, and that shows at lower resolutions in particular. But an intense streaming workload might expose more of the architecture's benefits.

Threadripper does have several configurable modes to tailor its response to various tasks, as we outlined in our AMD Ryzen Threadripper 1950X Game Mode, Benchmarked article. Game Mode and Creator Mode affect various titles differently, and we expect those same trends to carry over to our streaming benchmark. As such, we tested the Threadripper 1950X both ways.
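A quick way to confirm which mode is active before a benchmark run is to count the logical processors the operating system exposes, since Game Mode halves them. A minimal sketch, assuming a 1950X (32 threads in Creator Mode, 16 in Game Mode):

# Minimal sketch: check whether a Threadripper 1950X is presenting all of its
# threads (Creator Mode) or half of them (Game Mode) before starting a run.
import os

logical_cpus = os.cpu_count()

if logical_cpus == 32:
    print("Creator Mode: 16 cores / 32 threads exposed")
elif logical_cpus == 16:
    print("Game Mode: 8 cores / 16 threads exposed")
else:
    print(f"Unexpected thread count for a 1950X: {logical_cpus}")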

The Threadripper 1950X pulls into range of Intel's Skylake-X models during our baseline tests. Creator Mode (CM) exposes all 32 of the 1950X's available threads, yielding more potential horsepower than Game Mode (GM). The 1950X's Creator Mode also makes it possible to encode 100% of the test run's frames, while Game Mode drops 1.6% of them. That isn't a huge sacrifice, but Game Mode also causes Threadripper to average 26 FPS fewer. That means it isn't encoding as many frames, either. We would have expected higher frame rates from a lighter encoding workload, but disabling half of the 1950X's threads in pursuit of higher game performance doesn't always work well when you're streaming, too.
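To make the relationship between the drop rate and the frame-rate gap concrete, here is a tiny sketch of the frame accounting the paragraph above implies. The test duration and Creator Mode average are invented placeholders; only the 26 FPS gap and the 1.6% drop rate echo the figures quoted.

# Tiny sketch of the frame accounting implied above. The 90-second duration and
# 140 FPS Creator Mode average are invented; only the 26 FPS gap and the 1.6%
# drop rate come from the text.
test_seconds = 90
creator_fps = 140.0               # hypothetical Creator Mode average
game_fps = creator_fps - 26.0     # Game Mode averages 26 FPS fewer

creator_rendered = creator_fps * test_seconds
game_rendered = game_fps * test_seconds

creator_encoded = creator_rendered              # Creator Mode: 100% encoded
game_encoded = game_rendered * (1 - 0.016)      # Game Mode: drops 1.6%

print(f"Creator Mode: {creator_encoded:.0f} of {creator_rendered:.0f} frames encoded")
print(f"Game Mode:    {game_encoded:.0f} of {game_rendered:.0f} frames encoded")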

The 18C/36T Core i9-7980XE doesn't perform as well as the 10C/20T Core i9-7900X during the streaming tests. Specifically, the -7980XE stumbles in the smoothness department: its 99th percentile frame rates fall below the Threadripper 1950X's, despite a lead in average frame rates.

Aside from Threadripper 1950X in Game Mode, all of these processors encode 100% of the frames.

After bumping up clock rates, the Core i9-7980XE's in-game performance is higher than that of the Threadripper models during our streaming test. It also delivers a much better 99th percentile measurement.

AMD's Threadripper configurations fare better after tuning, too. Both encode 99.9% of the frames we send their way, which is adequate for a quality stream.

Grand Theft Auto V

Grand Theft Auto V's in-game performance goes Intel's way during the baseline benchmarks. Moreover, Core i9-7980XE redeems itself when it comes time to stream. That chip does command quite a premium though, so its advantage doesn't necessarily represent the best value.

Again, Threadripper's Game Mode just doesn't appear to be ideal for streaming. Game Mode provides better baseline frame rates in this title, but it falls to the bottom of our chart once we start encoding video.

Middle-earth: Shadow of War

Once again, Intel's processors offer the best in-game and streaming performance. This isn't entirely surprising; Threadripper may even be overkill for this type of enthusiast workload. We'd expect the architecture to handle workstation-class production workflows more adeptly.

Core i9-7900X does encounter a hiccup as its 99th percentile scores fall below the Threadripper 1950X's during the streaming workload. The 1950X in Creator Mode also experiences some variance, with 20.38% of its frame times exceeding the 16.667ms (60 FPS) threshold.
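As a refresher on how these smoothness metrics are derived, here is a minimal sketch that computes a 99th percentile frame time and the share of frames missing the 16.667ms (60 FPS) mark from a list of per-frame render times. The sample values are invented for illustration, not our benchmark data.

# Minimal sketch of the smoothness metrics referenced above, computed from
# per-frame render times in milliseconds (sample values are invented).
import numpy as np

frame_times_ms = np.array([14.1, 15.3, 16.9, 13.8, 22.4, 15.0, 17.2, 14.6])

p99_time = np.percentile(frame_times_ms, 99)          # 99th percentile frame time
p99_fps = 1000.0 / p99_time                           # expressed as a frame rate
slow_pct = np.mean(frame_times_ms > 16.667) * 100     # % of frames slower than 60 FPS

print(f"99th percentile: {p99_time:.2f} ms ({p99_fps:.1f} FPS)")
print(f"Frames slower than 16.667 ms: {slow_pct:.1f}%")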

Even after a bit of tuning, Core i9-7900X provides a higher average frame rate than the 1950X in Creator Mode during our streaming benchmark, but can't match AMD's 99th percentile performance.

The Threadripper 1950X's difficulty in Creator Mode while streaming Middle-earth is even more pronounced. Some aspect of OBS doesn't agree with Creator Mode and this one game. We ran the tests several times to ensure it was a repeatable phenomenon.

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: All CPUs Content

26 comments
  • SonnyNXiong
    Core i7 7700k is the best, can't get any better than that except for a high OC speed and high core clock speed on that core i7 7700k.
  • InvalidError
    This benchmark will need a do-over once the patch for Intel's critical ring-0 exploit comes out and slows all of Intel's CPUs from the past ~10 years by 5-30%.
  • mcconkeymike
    Sonny, I disagree. I personally run the 7700K at 4.9GHz, but the 8700K, with 6 cores/12 threads, is a badass and does what the 7700K does and better. Please do a little research on this topic, otherwise you'll end up looking foolish.
  • AndrewJacksonZA
    InvalidError said:
    This benchmark will need a do-over once the patch for Intel's critical ring-0 exploit comes out and slows all of Intel's CPUs from the past ~10 years by 5-30%.

    Yeah. It'll be interesting to see the impact on Windows. Phoronix ( https://www.phoronix.com/scan.php?page=article&item=linux-415-x86pti&num=2 ) shows a heavy hit in Linux for some tasks, but apparently almost no hit in gaming ( https://www.phoronix.com/scan.php?page=news_item&px=x86-PTI-Initial-Gaming-Tests )

    Although gaming in Linux, seriously?! C'mon! ;-)
  • ArchitSahu
    SonnyNXiong said:
    Core i7 7700k is the best, can't get any better than that except for a high OC speed and high core clock speed on that core i7 7700k.


    What about the 8700k? i9? 7820X?
  • AgentLozen
    What a great article. It really highlights the advantage of having more cores. If you're strictly into gaming without streaming, then Kaby Lake (and Skylake by extension) is still an awesome choice. It was really interesting to see it fall apart when streaming was added to the formula. I wasn't expecting it to do so poorly even in an overclocked setting.
  • Soda-88
    A couple of complaints:
    1) the article is 10 months late
    2) the game choice is poor, you should've picked titles that are popular on twitch

    Viewer count at present moment:
    BF1: ~1,300
    GTA V: ~17,000
    ME - SoW: ~600

    Personally, I'd like to see Overwatch over BF1 and PUBG over ME: SoW.
    GTA V is fine since it's rather popular, despite being notoriously Intel favoured.

    Other than that, a great article with solid methodology.
  • guadalajara296
    I do a lot of video encoding/rendering in Adobe CC Premiere Pro.
    It takes 2 hours to render a video on a Skylake CPU. Would a Ryzen multi-core improve that by 50%?
  • salgado18
    ArchitSahu said:
    SonnyNXiong said:
    Core i7 7700k is the best, can't get any better than that except for a high OC speed and high core clock speed on that core i7 7700k.
    What about the 8700k? i9? 7820X?


    Please, don't feed the trolls. Thanks.
  • lsatenstein
    To be able to respond to Paul's opening comment about repeatability testing, the rule is to have at least 19 test runs. The 19 runs will provide a 5 percent confidence interval; that is, 19 out of 20 results will be within 5 percent of the mean, which is about 1 standard deviation.
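For context, the interval that comment describes works out roughly as follows; a minimal sketch using a normal approximation and made-up FPS values (a t-distribution would be more appropriate for so few runs):

# Sketch of a 95% confidence interval for mean FPS across repeated runs.
# The FPS values are made up; NormalDist gives the ~1.96 two-sided z-score.
from statistics import NormalDist, mean, stdev

fps_runs = [141.2, 139.8, 142.5, 140.1, 138.9, 141.7, 140.4]  # hypothetical runs

m = mean(fps_runs)
s = stdev(fps_runs)                  # sample standard deviation
n = len(fps_runs)
z = NormalDist().inv_cdf(0.975)      # ~1.96 for a two-sided 95% interval

half_width = z * s / n ** 0.5
print(f"Mean FPS: {m:.1f} ± {half_width:.1f} (95% CI)")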
  • lsatenstein
    The new flaw discovered in older existing CPUs does not affect your testing above, because your CPUs are post flaw detection. But for the average person on the street (me), who does have older Intel CPUs, I just lost between 10% and 20% CPU performance, be it Windows or Linux.


    'Kernel memory leaking' Intel processor design flaw forces Linux, Windows redesign
    Google the title for more info.
  • InvalidError
    lsatenstein said:
    The new flaw discovered in older existing CPUs does not affect your testing above, because your CPUs are post flaw detection.

    None of the sources I have found exempt any Intel CPUs from the flaw, which appears to affect all of Intel's x86-64 CPUs up to the current generation. It takes several months of R&D past detection to design and manufacture a new chip with the fixes baked in, which means that unless Intel knew about the flaw 4+ months ahead of Coffee Lake's launch, all current Intel chips have the flaw.
  • razamatraz
    Wasn't the Ring-0 bug described found in 2015? At the time there was no demonstration that it actually could be abused, if I recall correctly, but Intel should have taken it seriously when designing Coffee Lake CPUs... not sure if they did; the list of affected CPUs is still under NDA. In any case, it seems to have much more effect on machines running VMs, multiple simultaneous users, etc. than on typical single-user gaming machines, as you'd have to already be compromised for this bug to have any impact on a single-user machine, i.e., they'd have to get malware on your machine in the first place. This bug doesn't help anyone do that.

    Waiting on actual answers, expected in the next week or two.
  • InvalidError
    razamatraz said:
    Wasn't the Ring-0 bug described found in 2015?

    None of the stories I've read about it so far make any mention of the actual date when Intel was made aware of the bug. If Microsoft had a patch ready for such a severe bug (unrestricted access to kernel space from user land) less than two months ago, I'd be surprised if Intel has been sitting on it for over a year: that would be a huge liability for Intel if a company got hacked and found out Intel knew about the vulnerability that enabled the hack long beforehand.
  • Honis
    I like how this shows how great multi-threaded performance can be with multiple processes running in the background.

    When I game, I may not stream, but I usually have a browser open, and a few sites I frequent have junky autoplay videos. My Ryzen 7 (and, I would think, an 8th-gen i7) handles these intermittent hiccups more gracefully than the 4-core processors.
  • Nintendork
    @guadalajara296
    Consider changing to Sony Vegas; the Adobe suite is pretty shi*tty regarding multicore optimization.
  • Yuka
    Thanks for this review. It really does bring nice information to the table.

    Are you going to test with more forgiving settings? I can stream pretty much every single game out there decently at 720p and 30 FPS using my i7 2700K (no hard figures, though). And will you guys update the numbers once the security patches are in place? I'd love to see how that affects streaming.

    Cheers!
  • cryoburner
    Soda-88 said:
    the game choice is poor, you should've picked titles that are popular on twitch... ...Personally, I'd like to see Overwatch over BF1 and PUBG over ME: SoW.

    The problem with games like Overwatch and Battlegrounds is that it would be difficult to get meaningful performance numbers out of them. Those games can only be played online, and there's no real way to control what other players are doing during a match. That's why they pick games where their tests can be performed in single player mode, where there can be relatively little variance between one benchmark run and the next.
  • DerekA_C
    @Cryoburner, the fact is most streamers don't stream single-player games; they stream multiplayer, something with actual competition. I do get why they tested it that way, but it isn't very realistic to stream only single-player; the test has to demonstrate those variances.
  • James Mason
    I wish this had a "dynamic" price-to-performance metric at the end that could compare the prices you pull from Amazon/Newegg against how good the CPUs (or GPUs, or whatever) are, to provide an always-relevant way to decide what's the best choice for whoever sees the article, whenever they see it.
  • elbert
    On the test setup page, why does the Z370 have 32GB (4x8GB) while the AM4 has only 16GB (2x8GB)? I've seen compression tests with RAM capacity causing performance differences.
  • Plumboby
    Um, streaming and recording on my trusty old i5-2400 with 8GB of RAM and a Sapphire Pulse RX 560 4G is pretty impressive for the age of the build, and the video card gives any big-daddy build a run for its money. AMD's ReLive runs silky smooth for recording; you don't notice it while live broadcasting, and recording gameplay is silky smooth on the Adrenalin drivers, which are supposedly bugged because of DX9 not working (not true).

    The worst thing is no one actually tests the original mainstream builds of yesteryear, which is BS and no true, real test of years of streaming and recording. I've got a channel open for the builds of yesteryear that are regarded as outdated crap, uploading daily to prove a point: you don't need wads of money and a big-budget build to record gameplay and stream without it affecting performance or causing lag. I've been impressed with my Sandy Bridge even in online multiplayer games; server lag isn't noticeable, and you don't need a beast of a build to keep up with the elite group just to record gameplay or stream. I've proven it with a true review of my AMD card, since there have been no real, honest tests.

    Half the bench tests Tom's and others carried out are heavily flawed. The best way to test is real-world open game testing for hours on end across 10 titles, even on older builds, to give a fairer representation of results. Hence why I started a channel for the not-so-mainstream old builds, to prove that you don't need a new-gen build to keep up and game smoothly, and I'm always happy to let others share their low-end build gameplay on my channel to represent more options. To any of the Tom's moderators and testers: for a fairer test, stop doing those BS benchmarks; they aren't true to the numbers. You need to test everything from older mainstream builds to new for the fairest guides and representation, as any online benchmark is flawed when you're only testing what a lot of people can't afford.
  • beavermml
    Can we use the Intel iGPU to offload the task of streaming, or can that not be done?
  • PaulAlcorn
    elbert said:
    On the test setup page, why does the Z370 have 32GB (4x8GB) while the AM4 has only 16GB (2x8GB)? I've seen compression tests with RAM capacity causing performance differences.


    Good eye, thanks for the heads-up, Elbert! That is a typo, which I have now corrected. We tested with 16GB for both machines.