4K stutters on computer with new 4K TV

I am having a strange issue with my computer at work... it is a video editing workstation and, up to now, we have had it projecting 720p on an Epson projector while its main display is 1920x1200. We bought new 4K cameras and a 4K TV to view our 4K files. The only problem is that when I try to play 4K content, the video "buffers" and lags constantly, unlike 1080p footage. I tried it on a higher-end editing bay and it worked fine. What's causing this? GPU? CPU? Here are the specs:

HP z230
Xeon E3-1225 v3 (3.20-3.60 GHz)
16 GB RAM
SSD
Quadro K600 with 3GB DDR3

I should also mention that I am only able to display 1080p on the 4K TV (upscaled) with the cable I have (I am ordering a 4K-capable one soon), but shouldn't it still be able to play a 4K video even when not on a 4K display? I tried it on the other computer on a 1080p display and it seemed to play fine. Thanks in advance for the help.
  1. JackNaylorPE:
    1. TVs make terrible monitors.

    2. That card is extremely weak for 4K, and while it made a great workstation card in its day, the hardware is just not optimized for what you want to do.

    3. What are the files... frame rates?
  2. getochkn:
    Depends on the codec the video is compressed with and whether the GPU or CPU has those decoding abilities. Build H.265 4K decoding into a chip and it can decode with next to no processing; rely on CPU-based software decoding on a system without built-in hardware decoding and it can drag a beast of a machine to its knees. Also, running it at 1080p is possibly forcing it to software-decode the video and then downscale it 4x to 1080p.
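    A quick way to pin down what the machine is actually being asked to decode is ffprobe, which ships with FFmpeg. A minimal sketch in Python, assuming ffprobe is on the PATH and "clip.mp4" is a stand-in for one of the 4K files:

    import subprocess

    def probe_video(path):
        # Ask ffprobe for the codec, resolution and frame rate of the
        # first video stream, printed as plain key=value lines.
        result = subprocess.run(
            ["ffprobe", "-v", "error",
             "-select_streams", "v:0",
             "-show_entries", "stream=codec_name,width,height,avg_frame_rate",
             "-of", "default=noprint_wrappers=1",
             path],
            capture_output=True, text=True, check=True)
        print(result.stdout)

    probe_video("clip.mp4")  # hypothetical file name

    Output of h264 at 3840x2160 with a 60/1 (or 60000/1001) frame rate would be exactly the heavy software-decode case described above.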
  3. alexb870:
    getochkn said:
    Depends on the codec the video is compressed with and whether the GPU or CPU has those decoding abilities. [...]

    It is MP4 H.264 60 fps footage. Both downloaded files and 4K YouTube videos stutter. I am running two displays; the TV is not a monitor, we are using it to watch our content. The other display is an old Dell using DVI. Both are set to 1080p for the time being. What I can't understand is that there are articles about Haswell i3 processors playing 4K on integrated graphics without issue, so why is a Xeon with dedicated graphics struggling?
  4. getochkn:
    alexb870 said:
    It is MP4 H.264 60 fps footage. Both downloaded files and 4K YouTube videos stutter. [...]

    That comes down to whether hardware decoding is built into the Xeon or not. Software decoding of H.264/H.265 at 4K stresses the CPU too much. A $40 Android box can decode 4K video IF a dedicated hardware decoder is built into it; otherwise it's impossible.
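    One rough way to confirm that the decode itself is the bottleneck is to force a pure software decode to a null sink and watch the speed FFmpeg reports; a sketch, again assuming FFmpeg is installed and "clip.mp4" stands in for one of the 4K files:

    import subprocess

    # Software-decode the clip and throw the frames away (no display, no
    # encode). FFmpeg prints a running "speed=" figure and, with -benchmark,
    # a CPU-time summary; a speed well below 1x means the CPU alone cannot
    # decode the file in real time.
    subprocess.run(
        ["ffmpeg", "-benchmark", "-i", "clip.mp4", "-f", "null", "-"],
        check=True)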
  5. alexb870:
    getochkn said:
    That comes down to whether hardware decoding is built into the Xeon or not. [...]

    How do I determine if it has hardware decoding built in? The processor is made for video editing, so I would think it would.
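    One practical way to check, sketched under the assumption that an FFmpeg build is available on the machine: list the hardware-acceleration methods the build knows about, then try one on a real file. If the accelerated run plays at full speed while the software run above does not, a working hardware decoder exists somewhere in the chain.

    import subprocess

    # List the hardware-acceleration methods compiled into this FFmpeg build
    # (typically dxva2/d3d11va on Windows, qsv for Intel Quick Sync, cuda
    # for NVIDIA). Being listed means the build supports the method, not
    # necessarily that the installed hardware does.
    subprocess.run(["ffmpeg", "-hwaccels"], check=True)

    # Attempt an actual accelerated decode; "dxva2" is an assumption, so
    # substitute a method from the list above, and a real file for "clip.mp4".
    subprocess.run(
        ["ffmpeg", "-hwaccel", "dxva2", "-i", "clip.mp4", "-f", "null", "-"],
        check=True)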
  6. alexb870:
    getochkn said:
    That comes down to whether hardware decoding is built into the Xeon or not. [...]

    I'm not sure about the CPU, but I determined that the GPU doesn't support H.265 4K 4:4:4 or 4:4:0 hardware decoding, but the K620 does... could this solve the problem?
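    Before swapping cards it may be worth timing the software and hardware decode paths side by side on the current setup; if the hardware path is no faster (or fails outright), decode support rather than the display is the limit. A minimal sketch with the same FFmpeg and "clip.mp4" assumptions as above:

    import subprocess, time

    def decode_seconds(extra_args):
        # Decode clip.mp4 to a null sink and return the wall-clock time taken.
        start = time.perf_counter()
        subprocess.run(
            ["ffmpeg", "-v", "error", *extra_args,
             "-i", "clip.mp4", "-f", "null", "-"],
            check=True)
        return time.perf_counter() - start

    print(f"software decode: {decode_seconds([]):.1f} s")
    # "cuda" is an assumption; use whatever `ffmpeg -hwaccels` listed.
    print(f"hardware decode: {decode_seconds(['-hwaccel', 'cuda']):.1f} s")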