Gaming And Streaming: Which CPU Is Best For Both?

Intel Core i5 & AMD Ryzen 5

We generated our stream using the "veryfast" encoder preset for this class of processors.
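For reference, the same preset class can be exercised outside OBS with a plain x264 software encode. The ffmpeg invocation below is illustrative only; the filenames and bitrate values are assumptions, not our exact OBS configuration:

```shell
# Illustrative software x264 encode at the "veryfast" preset.
# input.mp4 / output.mp4 and the 6000k bitrate are placeholder values.
ffmpeg -i input.mp4 -c:v libx264 -preset veryfast \
       -b:v 6000k -maxrate 6000k -bufsize 12000k \
       -c:a copy output.mp4
```

Slower presets squeeze more quality out of the same bitrate but raise CPU load sharply, which is exactly the trade-off in play throughout these tests.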



Battlefield 1

First, we run a set of baseline tests to gauge performance without an active YouTube stream. Intel's Core i5-8600K fares best, but the Core i5-7600K and Ryzen 5 1600X aren't far behind.

The story changes when we add streaming to the mix. All three CPUs lose varying amounts of performance, with Ryzen 5 1600X ending up on top (though not by much).

We have two entries in our charts for the Kaby Lake-based Core i5-7600K: normal and high-priority. First, let's talk about the normal results. Intel's 4C/4T architecture hamstrings the Core i5-7600K, and it loses nearly half of its average frame rate. Unsurprisingly, this has an impact on in-game frame time variance and unevenness measurements. Those numbers don't tell the entire story, though. Battlefield 1 is basically unplayable due to hitching and stuttering during our stream.
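For readers unfamiliar with these metrics, average frame rate and the 99th-percentile frame time both fall out of a per-frame render-time log. A minimal Python sketch, using made-up frame times rather than our measured data:

```python
# Sketch: deriving average FPS and 99th-percentile frame time from a
# frame-time log. The sample values below are invented, not measurements.
def percentile(values, p):
    """Nearest-rank percentile of a list of frame times (ms)."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

# 97 smooth frames plus three hitches -- the spikes barely move the
# average, but they dominate the 99th-percentile number.
frame_times_ms = [11.0] * 97 + [16.0, 24.0, 40.0]
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
p99_ms = percentile(frame_times_ms, 99)
```

This is why a chip can post a respectable average while still feeling hitchy: the tail of the frame-time distribution is what you actually perceive.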

The stream quality chart shows that Core i5-7600K only delivers 23.6% of its frames during the test, meaning it drops an astounding 76.4%. That makes for a completely unwatchable stream. Flip over to the CPU utilization charts; they tell the rest of the story. In short, Core i5-7600K doesn't have any headroom available to handle the parallelized encoding workload throughout most of our benchmark.
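The stream-quality percentages are simple bookkeeping on the encoder's output. A sketch with invented counts, chosen to mirror the 23.6% figure above:

```python
# Sketch of the stream-quality math: delivered vs. dropped frames.
# The counts are hypothetical, picked to reproduce the article's figures.
frames_expected = 1000   # frames a 60 FPS stream should contain
frames_encoded = 236     # frames the encoder actually produced
delivered_pct = 100 * frames_encoded / frames_expected
dropped_pct = 100 - delivered_pct
```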

The OBS software has several different settings to optimize performance, and adjusting process priority is a common tactic to improve streaming on Kaby Lake-based processors. So, we selected the high-priority setting (marked HP Stream in the charts). Effectively, this allows the encoder to steal cycles from the game engine, resulting in lower frame rates. But the encoder successfully processes 100% of the frames. Our Core i5-7600K fell to 51.6 FPS, and its 99th percentile measurements nosedived. On the flip side, less performance means fewer frames to encode per second, which in turn boosts encoding efficiency. Surprisingly, 91.05% of the frames landed within the desirable 16.667ms range.
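For anyone wanting to replicate the high-priority experiment by hand, Windows lets you raise a running process's priority class from the command line. The `obs64.exe` process name is an assumption here, and OBS exposes the same knob in its own settings:

```shell
# Illustrative only (Windows): raise a running process's priority class.
# Valid classes include "below normal", "normal", "above normal",
# "high priority", and "realtime".
wmic process where name="obs64.exe" CALL setpriority "high priority"
```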

We could always dial back the encoder preset to accommodate Intel's lackluster -7600K, but similarly priced processors in the test pool handle these quality settings adeptly. While streaming, Core i5-8600K provides nearly the same in-game average frame rate as AMD's Ryzen 5 1600X, and it encodes 100% of the frames. The -8600K's streaming quality isn't as impressive as Ryzen's, though: 89.44% of frames land within the desirable range, while ~10% fall either above or below the threshold. In our opinion, the stream looked fine, but it wasn't as smooth as the Ryzen 5 1600X's output.

A glance at the 99th percentile FPS chart tells us that Ryzen 5 1600X delivers the smoothest gaming experience during our streaming session. It also has plenty of reserves left in the tank, even during the streaming workload. In other words, you could probably step up to a more demanding encoder preset than the competing processors can handle.

AMD's Ryzen CPUs don't overclock as well as Intel's Core processors, though, so we tuned each model to see if Intel's extra headroom turned the tables. The x264 encoder uses AVX instructions extensively, so it serves as a nice stress test for overclocking stability. No doubt, Coffee Lake's AVX offset could come in handy, though it might also result in lower frequencies during streaming if you activate the feature (or leave it on Auto in the UEFI).
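The offset math itself is simple: each offset bin drops the clock by one multiplier step whenever AVX code runs. A sketch with an assumed two-bin offset (the 4.9 GHz figure matches our Coffee Lake overclock; the offset value is illustrative, not a measured setting):

```python
# Sketch of how an AVX offset lowers clocks during an x264 stream.
# The two-bin offset is an assumed example, not our UEFI configuration.
BIN_GHZ = 0.1              # one multiplier bin on these platforms
core_clock_ghz = 4.9       # all-core overclock
avx_offset_bins = 2        # hypothetical UEFI setting
avx_clock_ghz = core_clock_ghz - avx_offset_bins * BIN_GHZ
```

Because x264 keeps AVX units busy for the entire session, an offset effectively turns a streaming overclock into a lower-than-advertised all-core clock.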

Overclocking nudges the Core i5-7600K ahead of Ryzen 5 1600X during our baseline gaming test, but it does little to rectify the issues encountered during our streaming workload. Four threads just can't hack it. Streaming suffers even after adjusting the priority status. The -7600K does serve up a higher percentage of frames with the overclocked settings and normal prioritization, but it still costs the chip game performance. Priority adjustments have little bearing on gaming smoothness: the -7600K's 99th percentile measurements are terrible with both settings.

A 4.9 GHz overclock does boost the Coffee Lake-based Core i5-8600K's 99th percentile measurements. However, it still trails Ryzen 5 1600X. The tuned -8600K is certainly competitive thanks to a compelling gaming experience, but it can't match the Ryzen 5 1600X's streaming performance. It's apparent the encoder appreciates lots of threads, so Ryzen's 12 are a big advantage.

The frame time chart summarizes in-game performance well. Kaby Lake-based processors scribble their way across with extreme variance. The Core i5-8600K is also easy to spot during our streaming workload. Meanwhile, Ryzen 5 1600X provides a much more consistent in-game experience.

Grand Theft Auto V

Grand Theft Auto V scales well with additional host processing resources and tends to favor Intel architectures. It isn't surprising to see the Core CPUs lead during our baseline tests. But we weren't expecting the Core i5-8600K to maintain its advantage over the competition while streaming.

The Core i5-8600K delivers 100% of its frames during the test, though only 89.71% fall into the perfect 60 FPS bucket. We see a nearly even split of 5% above and below that mark. Visually, the stream looks fine. The -8600K takes more of a hit in the 99th percentile measurements, though it still leads the rest of our contenders.

AMD's Ryzen 5 1600X starts out with the lowest average frame rate during our baseline and falls to 60.7 FPS during the stream. It doesn't suffer too badly in 99th percentile measurements. Meanwhile, the chip provides stellar streaming performance, dropping no frames and almost landing at a smooth 60 FPS.

The Core i5-7600K provides a decent gaming experience during normal streaming, churning out 79.2 FPS. That's because it really isn't encoding frames, though—it drops 84.9% of them. The stream is unwatchable by any measurement. Again, we can boost the -7600K's encoding performance to 100% by specifying high priority, but then the gaming experience is terrible. In fact, large portions of the scene simply do not render correctly.

Overclocking the Core i5-7600K doesn't help; it's still entirely saturated during the streaming test. But tuning does benefit Intel's Core i5-8600K, which leads the pack in both average and 99th percentile metrics. Its streaming quality is decent as well.

The Ryzen 5 1600X's performance improves after overclocking, too. It doesn't provide the absolute best frame rate while streaming, but it is perfectly playable. Also, Ryzen 5 1600X still offers the smoothest stream, if only just barely.

Middle-earth: Shadow of War

Intel's Core i5-7600K continues to disappoint. It serves up an average of 51.4 FPS while streaming (using the high-priority setting) and encodes 100% of the frames. Unfortunately, game smoothness goes out the window again. The 15.9 FPS 99th percentile measurement tells us everything we need to know.

The Ryzen 5 1600X offers decent gaming performance while streaming, but that Coffee Lake-based -8600K continues leading.

AMD's Ryzen 5 1600X offers the best of both worlds, though Core i5-8600K is also very capable. Perhaps tuning the encode could get more performance from the Intel chip. Then again, you could say the same thing about Ryzen 5, which offers the best mix of streaming and frame rates in its class. It's obvious from these results that a Kaby Lake-based Core i5 owner needs to dial back streaming settings dramatically or turn to faster hardware.


  • SonnyNXiong
    Core i7 7700k is the best, can't get any better than that except for a high OC speed and high core clock speed on that core i7 7700k.
  • InvalidError
    This benchmark will need a do-over once the patch for Intel's critical ring-0 exploit comes out and slows all of Intel's CPUs from the past ~10 years by 5-30%.
  • mcconkeymike
    Sonny, I disagree. I personally run the 7700k at 4.9ghz, but the 8700k 6 core/12 thread is a badass and does what the 7700k does and better. Please do a little research on this topic otherwise you'll end up looking foolish.
  • AndrewJacksonZA
    InvalidError said:
    This benchmark will need a do-over once the patch for Intel's critical ring-0 exploit comes out and slows all of Intel's CPUs from the past ~10 years by 5-30%.

    Yeah. It'll be interesting to see the impact on Windows. Phoronix ( https://www.phoronix.com/scan.php?page=article&item=linux-415-x86pti&num=2 ) shows a heavy hit in Linux for some tasks, but apparently almost no hit in gaming ( https://www.phoronix.com/scan.php?page=news_item&px=x86-PTI-Initial-Gaming-Tests )

    Although gaming in Linux, seriously?! C'mon! ;-)
  • ArchitSahu
    SonnyNXiong said:
    Core i7 7700k is the best, can't get any better than that except for a high OC speed and high core clock speed on that core i7 7700k.


    What about the 8700k? i9? 7820X?
  • AgentLozen
    What a great article. It really highlights the advantage of having more cores. If you're strictly into gaming without streaming, then Kaby Lake (and Skylake by extension) is still an awesome choice. It was really interesting to see it fall apart when streaming was added to the formula. I wasn't expecting it to do so poorly even in an overclocked setting.
  • Soda-88
    A couple of complaints:
    1) the article is 10 months late
    2) the game choice is poor, you should've picked titles that are popular on twitch

    Viewer count at present moment:
    BF1: ~1.300
    GTA V: ~17.000
    ME - SoW: ~600

    Personally, I'd like to see Overwatch over BF1 and PUBG over ME: SoW.
    GTA V is fine since it's rather popular, despite being notoriously Intel favoured.

    Other than that, a great article with solid methodology.
  • guadalajara296
    I do a lot of video encoding / rendering in Adobe cc premier pro
    It takes 2 hours to render a video on Skylake cpu. would a Ryzen multi core improve that by 50% ?
  • salgado18
    ArchitSahu said:
    SonnyNXiong said:
    Core i7 7700k is the best, can't get any better than that except for a high OC speed and high core clock speed on that core i7 7700k.
    What about the 8700k? i9? 7820X?


    Please, don't feed the trolls. Thanks.
  • lsatenstein
    To be able to respond to Paul's opening comment about repeatability testing, the rule is to have at least 19 test runs. The 19 runs will provide a 5 percent confidence interval. That means, or could be understood to mean, that 19 times out of 20 the results will be within 5 percent of the mean, which is about 1 standard deviation.
  • lsatenstein
    The new flaw discovered with older existing CPUs does not affect your testing above, because your CPUs are post flaw detection. But for the average person on the street (me), who does have older Intel CPUs, I just lost between 10% and 20% CPU performance, be it Windows or Linux.


    Security
    'Kernel memory leaking' Intel processor design flaw forces Linux, Windows redesign
    Google the title for more info.
  • InvalidError
    lsatenstein said:
    The new flaw discovered with older existing CPUS does not affect your testing above, because your cpus are post flaw detection.

    None of the sources I have found exempt any Intel CPUs from the flaw which appears to affect all of Intel's x86-64 CPUs up to current. It takes several months of R&D past detection to design and manufacture a new chip with the fixes baked-in, which means that unless Intel knew about the flaw 4+ months ahead of Coffee Lake's launch, then all current Intel chips have the flaw.
  • razamatraz
    Wasn't the Ring-0 bug described found in 2015? At the time there was no demonstration that it actually could be abused if I recall correctly but Intel should have taken it seriously when designing Coffee Lake CPUs...not sure if they did; the list of affected CPUs is still under NDA. In any case it seems to have much more effect on machines running VMs, multiple simultaneous users etc than on typical single user gaming machines as you'd have to already be compromised for this bug to have any impact on a single user machine...IE they'd have to get malware on your machine in the first place. This bug doesn't help anyone do that.

    Waiting on actual answers, expected in the next week or two.
  • InvalidError
    razamatraz said:
    Wasn't the Ring-0 bug described found in 2015?

    None of the stories I've read about it so far make any mention of the actual date where Intel was made aware of the bug. If Microsoft had a patch ready for such a severe bug (unrestricted access to kernel space from user land) less than two months ago, I'd be surprised if Intel has been sitting on it for over a year: that would be a huge liability for Intel if a company got hacked and found out Intel knew about the vulnerability that enabled the hack long beforehand.
  • Honis
    I like how this shows how great multi-threaded performance can be with multiple processes running in the background.

    I know when I game, I may not stream, but I usually have a browser open, and a few sites I frequent have junky autoplay videos. My Ryzen 7, and I would think an i7-8 series, handles these intermittent hiccups more gracefully than the 4-core processors.
  • Nintendork
    @guadalajara296
    Consider changing to sony vegas, Adobe suite is pretty shi*tty regarding multicore optimization.
  • Yuka
    Thanks for this review. It really does bring nice information to the table.

    Are you going to test with more forgiving settings? I can stream decently with 720p and 30FPS using my i7 2700K (no hard figures though), pretty much every single game out there. And will you guys update the numbers once the security patches are in place? I'd love to see how that affects Streaming.

    Cheers!
  • cryoburner
    Soda-88 said:
    the game choice is poor, you should've picked titles that are popular on twitch... ...Personally, I'd like to see Overwatch over BF1 and PUBG over ME: SoW.

    The problem with games like Overwatch and Battlegrounds is that it would be difficult to get meaningful performance numbers out of them. Those games can only be played online, and there's no real way to control what other players are doing during a match. That's why they pick games where their tests can be performed in single player mode, where there can be relatively little variance between one benchmark run and the next.
  • DerekA_C
    @Cryoburner, the fact is most streamers don't stream single player games they stream multiplayer, something with actual competition, but I do get why they tested it that way. However, it isn't very realistic to stream only single player it has to demonstrate those variances.
  • James Mason
    I wish this had a "dynamic" Price to Performance metric at the end that could compare to the prices you pull from amazon/newegg to how good the CPUs (or GPUs/whatever) to provide an always relevant way to decide what's the best choice for whoever and whenever sees the article.
  • elbert
    On the test setup page, why does the Z370 have (4x8GB) 32GB while the AM4 only has (2x8GB) 16GB? I've seen compression tests with RAM capacity causing performance differences.
  • Plumboby
    Um, streaming & recording on my trusty old i5-2400 with 8GB RAM & a Sapphire Pulse RX 560 4G is pretty impressive for the age of the build, & the video card gives any big-daddy build a run for their $. The ReLive that AMD runs is silky smooth to record with; you don't notice it live broadcasting, & recording gameplay is silky smooth on the Adrenalin drivers, which are meant to be bugged because of DX9 not working (not true).

    The worst thing is no one actually tests the original mainstream builds of yesteryear, which is BS & no true real test of the years of streaming & recording. I got a channel open for the builds of yesteryear regarded as outdated crap, uploading daily to prove a point: you don't need wads of $ & a big-budget build to record gameplay & stream without it affecting performance or lag. I've been impressed with my Sandy Bridge; even in online multiplayer games, server lag isn't noticed. You don't need a beast of a build to keep up with the elite group just to record gameplay or stream, & I have proven it with a true review of my AMD card, as there have been no true, real, honest tests.

    Half the bench tests Toms & others carried out are heavily flawed. The best way to test is real-world open game testing for hours on end throughout 10 titles, even on older builds, to give a fairer representation of results. Hence why I started a channel for the not-so-mainstream old builds: to prove that you don't need a new-gen build to keep up & be able to game smoothly. I'm always happy to allow others to share their low-end build gameplay on my channel to represent more options. To any of the Toms moderators & testers: for a fairer test, stop doing those BS benchmarks; they are not true to the numbers. You need to test everything from older mainstream builds to new for the fairest guides & representation, as any online benchmark is flawed when you're only testing what a lot can't afford.
  • beavermml
    can we use Intel iGPU to offload the task of streaming? or it cannot be done?
  • PaulAlcorn
    elbert said:
    On the test setup page why does the Z370 have (4X8GB)32GB while the AM4 only (2x8GB)16GB? Ive seen compression tests with RAM capacity causing performance differances.


    Good eye, thanks for the heads-up, Elbert! That is a typo, which I have now corrected. We tested with 16GB for both machines.