Gaming And Streaming: Which CPU Is Best For Both?

Final Analysis

AMD democratized access to high core counts with an attractively priced Ryzen portfolio. In a way, we can thank the company for Intel's newfound interest in competing on a $/core basis. Just look at the difference in our results from Kaby Lake to Coffee Lake. As a result, high-quality software encoding on a gaming PC is becoming more realistic for mainstream gamers.

Our testing is indicative of general performance trends. Given enough time and energy, you could almost certainly improve upon our results. Part of that is by design: we're using these settings to compare large groups of processors against one another on a level playing field, as opposed to wringing the most performance out of any one processor. While we can't use our benchmarks to make definitive statements about the possibilities with each chip, we can draw some fair conclusions about how certain architectures behave.

Encoding is a parallelizable workload. If software encoding is your primary goal, you'll definitely want to seek out CPUs with lots of cores and simultaneous multi-threading capabilities. Intel's quad-core Kaby Lake models illustrate how chips that once offered class-leading gaming performance can fall apart during streaming. You can boost their performance by using less intensive quality presets, lowering the streaming frame rate, or sacrificing some quality with GPU acceleration. However, competing processors offer much more performance than Kaby Lake at the same presets and roughly the same price.
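For readers who want to experiment with those trade-offs outside of a full streaming setup, the sketch below shows how the same three knobs (encoder preset, stream frame rate, and GPU offload) map onto ffmpeg's libx264 and NVENC encoders. Treat it as an illustration under stated assumptions: ffmpeg must be installed, recording.mp4 is a hypothetical input file, and the bitrates are typical streaming values, not the exact settings used in our benchmarks.

```python
# Minimal sketch of the CPU-vs-quality trade-offs described above, expressed
# as ffmpeg arguments. Assumes ffmpeg is on the PATH; "recording.mp4" and the
# bitrates are illustrative, not the article's test configuration.
import subprocess

def encode(source, output, preset="veryfast", fps=60, use_nvenc=False):
    """Re-encode a recording, trading image quality for CPU headroom."""
    codec = "h264_nvenc" if use_nvenc else "libx264"  # GPU vs. software encoding
    cmd = [
        "ffmpeg", "-y", "-i", source,
        "-c:v", codec,
        "-b:v", "6000k",      # typical streaming video bitrate
        "-r", str(fps),       # a lower frame rate means fewer frames to encode
    ]
    if not use_nvenc:
        cmd += ["-preset", preset]  # faster x264 presets cut CPU load at some quality cost
    cmd += ["-c:a", "aac", "-b:a", "160k", output]
    subprocess.run(cmd, check=True)

# A strained quad-core might drop to a faster preset and a 30 fps stream,
# or hand the whole job to NVENC (requires an Nvidia GPU):
encode("recording.mp4", "stream_cpu.mp4", preset="veryfast", fps=30)
encode("recording.mp4", "stream_gpu.mp4", use_nvenc=True)
```

Each of those changes frees up CPU time for the game itself; the price is lower image quality at a given bitrate, which is exactly the compromise the weaker quad-cores force on you.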

Many streamers place video quality over maximizing the frame rate of whatever game they're playing, so your own priorities will largely dictate how you tune your system. In fact, turning on v-sync may be a good way to balance streaming and gaming performance. If you seek the highest in-game performance while you stream, Intel's Coffee Lake-based Core i7-8700K is a good fit. The Ryzen 7 1800X is also competitive and tends to offer better streaming performance. Using our settings, the 1800X also had more CPU headroom left over for more taxing encode settings, if desired. Granted, some of that extra horsepower is due to the 1800X's lower gaming performance, which means there are fewer frames to encode.

Two extra cores on the Coffee Lake-based Core i5 certainly help its standing, but the lack of Hyper-Threading has a definite impact on streaming performance. In the end, the six-core Core i5-8600K is forced to battle the 12-thread Ryzen 5 1600X, which offers a more balanced profile. Overclocking does help Intel; it can't overcome the advantage AMD gets from a more thread-heavy architecture, but it does shrink the gap in streaming workloads.
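If you're curious how much thread count matters for software encoding on your own CPU, a quick timing loop like the one below gives a rough picture. This is only a sketch under stated assumptions: ffmpeg must be installed, and it encodes ffmpeg's synthetic testsrc2 pattern as a stand-in for real game footage, so the absolute times won't match our benchmark workload.

```python
# Rough sketch of x264 thread scaling: encode a synthetic 1080p60 clip with
# different thread counts and time each run. Assumes ffmpeg is on the PATH;
# the testsrc2 pattern is only a stand-in for real game footage.
import subprocess
import time

def timed_encode(threads, seconds=10):
    cmd = [
        "ffmpeg", "-y",
        "-f", "lavfi", "-i", f"testsrc2=duration={seconds}:size=1920x1080:rate=60",
        "-c:v", "libx264", "-preset", "medium",
        "-threads", str(threads),
        "-f", "null", "-",    # discard the output; only the encode time matters
    ]
    start = time.perf_counter()
    subprocess.run(cmd, check=True, capture_output=True)
    return time.perf_counter() - start

for threads in (2, 4, 6, 8, 12):
    print(f"{threads:>2} threads: {timed_encode(threads):5.1f} s")
```

The gains typically taper off as thread count rises, but the step from four threads to eight or twelve is exactly where the Kaby Lake quad-cores fall behind the six-core, twelve-thread parts.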

If you're really serious about streaming and gaming at the same time, the highest-end desktop CPUs are an option. Just expect to pay dearly for them. Most enthusiasts are better served by mainstream processors. Intel's Core i9 models generally provide better performance than the Threadripper 1950X, but they cost more, too. The 1950X is a solid value choice that also offers a diverse range of capabilities.

There are plenty of other solid options for gaming and streaming, and this introductory round of tests focused only on high-end models from each family. We'll expand our testing to locked SKUs as we work through upcoming CPU reviews.

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: All CPUs Content

26 comments
  • SonnyNXiong
    Core i7 7700k is the best, can't get any better than that except for a high OC speed and high core clock speed on that core i7 7700k.
  • InvalidError
    This benchmark will need a do-over once the patch for Intel's critical ring-0 exploit comes out and slows all of Intel's CPUs from the past ~10 years by 5-30%.
  • mcconkeymike
    Sonny, I disagree. I personally run the 7700K at 4.9GHz, but the 8700K, with 6 cores and 12 threads, is a badass and does what the 7700K does and better. Please do a little research on this topic, otherwise you'll end up looking foolish.
  • AndrewJacksonZA
    125865 said:
    This benchmark will need a do-over once the patch for Intel's critical ring-0 exploit comes out and slows all of Intel's CPUs from the past ~10 years by 5-30%.

    Yeah. It'll be interesting to see the impact on Windows. Phoronix ( https://www.phoronix.com/scan.php?page=article&item=linux-415-x86pti&num=2 ) shows a heavy hit in Linux for some tasks, but apparently almost no hit in gaming ( https://www.phoronix.com/scan.php?page=news_item&px=x86-PTI-Initial-Gaming-Tests ).

    Although gaming in Linux, seriously?! C'mon! ;-)
  • ArchitSahu
    2523846 said:
    Core i7 7700k is the best, can't get any better than that except for a high OC speed and high core clock speed on that core i7 7700k.


    What about the 8700k? i9? 7820X?
  • AgentLozen
    What a great article. It really highlights the advantage of having more cores. If you're strictly into gaming without streaming, then Kaby Lake (and Skylake by extension) is still an awesome choice. It was really interesting to see it fall apart when streaming was added to the formula. I wasn't expecting it to do so poorly even in an overclocked setting.
  • Soda-88
    A couple of complaints:
    1) the article is 10 months late
    2) the game choice is poor, you should've picked titles that are popular on twitch

    Viewer count at present moment:
    BF1: ~1,300
    GTA V: ~17,000
    ME - SoW: ~600

    Personally, I'd like to see Overwatch over BF1 and PUBG over ME: SoW.
    GTA V is fine since it's rather popular, despite being notoriously Intel favoured.

    Other than that, a great article with solid methodology.
  • guadalajara296
    I do a lot of video encoding/rendering in Adobe CC Premiere Pro.
    It takes two hours to render a video on a Skylake CPU. Would a multi-core Ryzen improve that by 50%?
  • salgado18
    1632944 said:
    2523846 said:
    Core i7 7700k is the best, can't get any better than that except for a high OC speed and high core clock speed on that core i7 7700k.
    What about the 8700k? i9? 7820X?


    Please, don't feed the trolls. Thanks.
  • lsatenstein
    To respond to Paul's opening comment about repeatability testing: the rule of thumb is to do at least 19 test runs. Nineteen runs give you roughly a 95 percent confidence level; in other words, 19 times out of 20 the results should fall within about 5 percent of the mean (for a normal distribution, that 95 percent band spans roughly two standard deviations).
  • lsatenstein
    The new flaw discovered with older existing CPUs does not affect your testing above, because your CPUs are post flaw detection. But for the average person on the street (me), who does have older Intel CPUs, I just lost between 10% and 20% of my CPU performance, be it Windows or Linux.


    Security
    'Kernel memory leaking' Intel processor design flaw forces Linux, Windows redesign
    Google the title for more info.
  • InvalidError
    702528 said:
    The new flaw discovered with older existing CPUs does not affect your testing above, because your CPUs are post flaw detection.

    None of the sources I have found exempt any Intel CPUs from the flaw, which appears to affect all of Intel's x86-64 CPUs up to the current generation. It takes several months of R&D past detection to design and manufacture a new chip with the fixes baked in, which means that unless Intel knew about the flaw 4+ months ahead of Coffee Lake's launch, all current Intel chips have the flaw.
  • razamatraz
    Wasn't the Ring-0 bug described found in 2015? At the time there was no demonstration that it actually could be abused, if I recall correctly, but Intel should have taken it seriously when designing Coffee Lake CPUs... not sure if they did; the list of affected CPUs is still under NDA. In any case, it seems to have much more effect on machines running VMs, multiple simultaneous users, etc. than on typical single-user gaming machines, as you'd have to already be compromised for this bug to have any impact on a single-user machine, i.e. they'd have to get malware on your machine in the first place. This bug doesn't help anyone do that.

    Waiting on actual answers, expected in the next week or two.
  • InvalidError
    1579424 said:
    Wasn't the Ring-0 bug described found in 2015?

    None of the stories I've read about it so far make any mention of the actual date when Intel was made aware of the bug. If Microsoft had a patch ready for such a severe bug (unrestricted access to kernel space from userland) less than two months ago, I'd be surprised if Intel has been sitting on it for over a year: it would be a huge liability for Intel if a company got hacked and found out that Intel had known about the vulnerability that enabled the hack long beforehand.
  • Honis
    I like how this shows how great multi-threaded performance can be with multiple processes running in the background.

    I know when I game, I may not stream, but I usually have a browser open, and a few sites I frequent have junky autoplay videos. My Ryzen 7, and I would think an 8th-gen Core i7, handles these intermittent hiccups better than the four-core processors.
  • Nintendork
    @guadalajara296
    Consider changing to Sony Vegas; the Adobe suite is pretty shi*tty regarding multi-core optimization.
  • Yuka
    Thanks for this review. It really does bring nice information to the table.

    Are you going to test with more forgiving settings? I can stream pretty much every single game out there decently at 720p and 30FPS using my i7-2700K (no hard figures, though). And will you guys update the numbers once the security patches are in place? I'd love to see how that affects streaming.

    Cheers!
  • cryoburner
    512760 said:
    the game choice is poor, you should've picked titles that are popular on twitch... ...Personally, I'd like to see Overwatch over BF1 and PUBG over ME: SoW.

    The problem with games like Overwatch and Battlegrounds is that it would be difficult to get meaningful performance numbers out of them. Those games can only be played online, and there's no real way to control what other players are doing during a match. That's why they pick games where their tests can be performed in single player mode, where there can be relatively little variance between one benchmark run and the next.
  • DerekA_C
    @Cryoburner, the fact is most streamers don't stream single-player games; they stream multiplayer, something with actual competition. I do get why they tested it that way, but it isn't very realistic to test only single-player streaming; the testing has to demonstrate those variances.
  • James Mason
    I wish this had a "dynamic" price-to-performance metric at the end that compared the prices you pull from Amazon/Newegg against how good the CPUs (or GPUs, or whatever) are, to provide an always-relevant way to decide what the best choice is for whoever sees the article, whenever they see it.
  • elbert
    On the test setup page, why does the Z370 have 32GB (4x8GB) while the AM4 has only 16GB (2x8GB)? I've seen compression tests where RAM capacity caused performance differences.
  • Plumboby
    Um, streaming and recording on my trusty old i5-2400 with 8GB of RAM and a Sapphire Pulse RX 560 4G is pretty impressive for the age of the build, and the video card gives any big-daddy build a run for its money. The ReLive recording AMD provides is silky smooth; you don't notice it while live broadcasting, and recording gameplay is silky smooth on the Adrenalin drivers, which are supposedly bugged because DX9 doesn't work (not true).

    The worst thing is that nobody actually tests the original mainstream builds of yesteryear, which is BS and no true, real test of years of streaming and recording. I've got a channel open for builds of yesteryear that get written off as outdated crap, uploading daily to prove a point: you don't need wads of cash and a big-budget build to record gameplay and stream without it affecting performance or causing lag. I've been impressed with my Sandy Bridge even in online multiplayer games; server lag isn't noticeable, and you don't need a beast of a build to keep up with the elite group just to record gameplay or stream. I've proven it with a true review of my AMD card, since there have been no truly honest tests.

    Half the bench tests Tom's and others carry out are heavily flawed. The best way to test is real-world open game testing for hours on end across ten titles, even on older builds, to give a fairer representation of results. That's why I started a channel for the not-so-mainstream old builds: to prove you don't need a new-gen build to keep up and game smoothly, and I'm always happy to let others share their low-end build gameplay on my channel to represent more options. To the Tom's moderators and testers: for a fairer test, stop doing those BS benchmarks; they aren't true to the numbers. You need to test everything from older mainstream builds to new ones for the fairest guides and representation, because any online benchmark is flawed when you're only testing what a lot of people can't afford.
  • beavermml
    Can we use the Intel iGPU to offload the task of streaming? Or can that not be done?
  • PaulAlcorn
    45049 said:
    On the test setup page, why does the Z370 have 32GB (4x8GB) while the AM4 has only 16GB (2x8GB)? I've seen compression tests where RAM capacity caused performance differences.


    Good eye, thanks for the heads-up, Elbert! That is a typo, which I have now corrected. We tested with 16GB for both machines.