Dell S2718D HDR Monitor Review

HDR Measurements & Hands-On

To test the S2718D’s HDR feature, we had to add a new device to our calibration toolkit: the HD Fury Integral. It’s a small box that takes the 1080p SDR signal from our Accupel pattern generator and adds the appropriate HDR10 metadata before passing it along to the display. We also used a CalMAN workflow designed specifically for HDR10 calibration. It measures luminance tracking against the HDR10 EOTF, as well as Rec.2020, DCI-P3, and Rec.709 color.
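
For readers curious about what that metadata actually contains, HDR10 carries a small static package per title: SMPTE ST 2086 mastering display primaries and luminance, plus MaxCLL and MaxFALL light levels. The sketch below is purely illustrative; the field names and example values are our own placeholders for a typical P3/D65 master, not a readout from the Integral.

```python
# Illustrative sketch of HDR10's static metadata (SMPTE ST 2086 mastering
# display info plus MaxCLL/MaxFALL). The values are hypothetical examples.
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    red_xy: tuple              # mastering display primaries, CIE 1931 x/y
    green_xy: tuple
    blue_xy: tuple
    white_xy: tuple            # white point (D65)
    max_mastering_nits: float  # peak luminance of the mastering display
    min_mastering_nits: float  # black level of the mastering display
    max_cll: int               # Maximum Content Light Level (brightest pixel), nits
    max_fall: int              # Maximum Frame-Average Light Level, nits

# A P3/D65 master graded on a 1,000-nit display might look like this:
example = HDR10StaticMetadata(
    red_xy=(0.680, 0.320), green_xy=(0.265, 0.690), blue_xy=(0.150, 0.060),
    white_xy=(0.3127, 0.3290),
    max_mastering_nits=1000.0, min_mastering_nits=0.005,
    max_cll=1000, max_fall=400,
)
```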

When it comes to the S2718D, it’s important to note two things: it must be switched manually into one of its two HDR picture modes before you send it an HDR signal, and there are no image adjustments available in either Movie HDR or Game HDR. What you see is what you get. The monitor also engages dynamic contrast even though the OSD option is grayed out; we could see brightness pumping when switching quickly between patterns of varying brightness. There is also visible edge enhancement that cannot be defeated with the sharpness slider. Luckily, color and luminance are fairly accurate. Let’s check out the pertinent graphs.

First up is grayscale tracking. The chart is laid out similarly to the ones we use for SDR, except the percent values have been replaced with code values, and there are 16 measurement points. EOTF is the most important aspect of HDR rendering, because luminance tracking is what creates the impression of greater contrast. If the display’s levels don’t match the signal’s, the whole effect collapses. You can see the grayscale stays close to D65 until around code value 500, roughly midway up the brightness scale. At that point, blue begins to clip and a progressively warmer tone sets in as you approach maximum brightness.

The EOTF chart shows the clipping more clearly. Output starts to roll off around CV500 and clips completely at CV700, which means any signal above CV700 displays at the same brightness level. How that affects content depends on how much dynamic range was used during the encoding process. If Dell had left the contrast control unlocked, it could be lowered to compensate, though bringing out that extra detail would make the picture somewhat flat and murky. The upside is that below the clipping point, the monitor follows the EOTF very closely.
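
For context, the HDR10 EOTF is the SMPTE ST 2084 PQ curve, which maps each code value to an absolute luminance up to 10,000 nits. Here is a minimal sketch of that math, assuming full-range 10-bit code values (limited-range signaling would shift the numbers somewhat). It shows why the CV700 clip matters: under this assumption it corresponds to only a few hundred nits, well short of the 1,000-nit and brighter highlights many HDR masters contain.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10.
# Assumes full-range 10-bit code values (0-1023).
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(code_value: int, bit_depth: int = 10) -> float:
    """Convert a PQ code value to absolute luminance in cd/m2 (nits)."""
    e = code_value / (2 ** bit_depth - 1)   # normalized signal, 0..1
    ep = e ** (1 / M2)
    y = max(ep - C1, 0.0) / (C2 - C3 * ep)
    return 10000.0 * y ** (1 / M1)

for cv in (100, 300, 500, 700, 1023):
    print(f"CV{cv}: {pq_to_nits(cv):.1f} nits")
# CV500 works out to roughly 80 nits and CV700 to roughly 540 nits, so a
# display that clips at CV700 discards everything graded above that level.
```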

Moving on to color, we ran three sweeps covering Rec.2020, DCI-P3, and Rec.709. Current Ultra HD Blu-ray discs are mastered in the DCI-P3 gamut. The S2718D tracks color saturation well until it hits the limits imposed by its native primaries. This is how an HDR display should look. It manages to nail every target until it simply runs out of color. Though the highest saturations won’t appear, everything below that will be rendered correctly with maximum detail.
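
For readers unfamiliar with how a saturation sweep is built, the targets step outward from the D65 white point toward each primary of the gamut under test. The snippet below is a simplified sketch of that idea using a plain linear interpolation in CIE 1931 xy; CalMAN's actual targets also specify a luminance for every step, which this sketch glosses over.

```python
# Simplified sketch of DCI-P3 saturation-sweep chromaticity targets:
# step from the D65 white point toward each primary in CIE 1931 xy.
D65 = (0.3127, 0.3290)
P3_PRIMARIES = {
    "red":   (0.680, 0.320),
    "green": (0.265, 0.690),
    "blue":  (0.150, 0.060),
}

def sweep(white, primary, steps=(0.2, 0.4, 0.6, 0.8, 1.0)):
    """Linearly interpolate xy targets from the white point to a primary."""
    wx, wy = white
    px, py = primary
    return [(wx + s * (px - wx), wy + s * (py - wy)) for s in steps]

for name, prim in P3_PRIMARIES.items():
    targets = sweep(D65, prim)
    print(name, [(round(x, 3), round(y, 3)) for x, y in targets])
```

A panel like the S2718D hits the inner targets accurately and only misses the outermost points that fall beyond its native primaries.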

When watching HDR content on the S2718D, one should remember that it is still an IPS panel with a native contrast ratio below 1000:1. It won’t suddenly look like an OLED panel in HDR mode. But that judgment must be made while viewing actual content, which is exactly what we did next.

Playing HDR Content On The S2718D

We used the Philips BDP-7501 Ultra HD Blu-ray player to spin a few UHD discs. First off, it’s not quite a plug-and-play experience. You must engage your settings in the right order to make it all work, but it’s simple. After powering up the player, select the S2718D’s Movie HDR picture mode before inserting a disc. This ensures a proper handshake through the signal path and engages HDR10 decoding; in our case, the player’s on-screen messages confirmed we were in HDR mode. If you load the disc before changing picture modes, the player won’t switch the monitor to HDR mode and you’ll need to eject the disc and start again.

To assess the difference in quality, we watched J.J. Abrams’ Star Trek and The Martian in both Ultra HD and standard Blu-ray. The S2718D happily accepts Ultra HD signals at full 3840x2160 resolution and down-converts them internally to its native 2560x1440 with no apparent artifacts. Watching the same scenes from both standard and UHD discs showed subtle differences in color and dynamic range. The HDR version looks a bit more saturated, and there is clearer detail in both highlight and shadow areas. The dynamic contrast feature doesn’t cause the visible issues we saw during our pattern tests, and the added edge enhancement seems subtle enough not to degrade clarity.

MORE: Best Gaming Monitors

MORE: Best Professional Monitors

MORE: How We Test Monitors

MORE: How To Choose A Monitor

MORE: All Monitor Content

24 comments
  • WINTERLORD
    cant wait till they have 4k in hdr at affordable prices
  • daglesj
    I'll take the features but ditch the stand for a standard removable with VESA and the controls built in as normal.
  • GentlemanGreen
    stopped reading at 60 hz
  • LionD
How could an 8-bit/sRGB display, with 1000:1 contrast and no local dimming, deliver a true HDR experience? Total nonsense.
  • CarbonBased
    @GENTLEMANGREEN

    Lots of people have plenty of use for 60Hz screens. Stop poo-pooing products that clearly aren't aimed at you. I have a rig for gaming, and sure, 60Hz isn't really enough anymore. However, I take and edit photos as a hobby, so IPS, 10-bit, HDR, all very attractive features. Add that I can mate it to my photo editing laptop with a USB-C cable and we're really getting somewhere. I'll be looking for this one come holiday season.
  • cbliss
NOT AN HDR MONITOR.. FALSE ADVERTISING.. BUYERS BEWARE!! (HDR requires a 10-bit panel, this is 8-bit. It also lacks any form of local dimming.) Bogus product for HDR; otherwise simply an overpriced QHD monitor.
  • CarbonBased
Fair enough, I didn't realize that it was 8-bit instead of 10-bit. But I will stand by my point that 60Hz is fine for many, if not most, computer users, even if they are gamers. The market for high refresh rates is specifically gamer-centric. Dissing products that aren't built to gamer spec because you are a gamer does not lend one to being an unbiased source of opinion.
  • Scott____67
I like to wall mount my monitor anyway, so the stand is non-existent, and in a condo it keeps areas and desk space clear. Plus, having a little height with a downward pitch is perfect for the lean-back-in-the-chair gamer that I am.
  • alextheblue
    CarbonBased said:
    Fair enough, I didn't realize that it was 8-bit instead of 10-bit. But I will stand by my point that 60Hz is fine for many, if not most, computer users, even if they are gamers. The market for high refresh rates is specifically gamer-centric. Dissing products that aren't built to gamer spec because you are a gamer does not lend one to being an unbiased source of opinion.

    Agreed. A 60hz monitor isn't great for gaming anymore, so for my personal needs and budget I'm better off with a halfway decent TN panel with high refresh rate, wide freesync range, and low input lag. That might change in the future, as advanced displays come down in price. But today that's what best fits my needs.

    But as you said most non-gaming applications don't need high refresh rates. Users who don't game will typically favor resolution, contrast, brightness, viewing angle, and color reproduction over refresh rate and input latency. If you have a sub-$300 budget like I do you often end up with a display that either favors gaming performance and features, or image quality and advanced colorspaces. Just because you favor a high-refresh gaming monitor doesn't mean you can't recognize uses for a non-gaming display.

    Granted if you spend enough money you can get a display that doesn't compromise much and is fairly good at everything. Way out of my price range at this point, though.
  • alextheblue
    Quote:
    To see HDR content, you’ll need a compatible player or computer with an HDMI 2.0/HDCP 2.2 output. The latest Ultra HD Blu-ray players feature this interface. You can also connect with the right video card. Fortunately, there are quite a few choices. On the Nvidia side is the GTX 950 up to the Titan X (Maxwell), or the GTX 1050 to Titan X (Pascal). AMD users can employ an R9 390X or RX 460, 470, or 480.

    I thought anything with Polaris would support HDR10, such as Radeon 540/550 (Polaris 12). Maybe I'm misremembering. Also, on PC you have to use HDR10-compatible playback software to benefit.

    On the console side of things, Xbox One S has supported Ultra HD (4K HDR10) BDs for some time. If I was looking for a dedicated box, it's a good choice even if you don't play console games. It's not much more than a decent dedicated 4K HDR10 player, and it has better support for apps. You can add a Kinect if you want voice control. If you don't use physical discs but want a dedicated box for 4K HDR streams, then I'd recommend a Roku Premiere+ or Ultra.
  • rantoc
    This monitor is as much HDR as the consoles are 4k capable ;)
  • i-am-i-u-r-u
Way overpriced, and phony marketing as HDR. The after-calibration black luminance of 0.2663 and contrast ratio of 761:1 are pitiful.
  • cryoburner
    daglesj said:
    I'll take the features but ditch the stand for a standard removable with VESA and the controls built in as normal.


    I don't get why they would give this relatively high-end monitor such a mediocre non-removable stand and no VESA mounts. It might make the monitor look relatively nice when viewed from the back and side, but that's not likely to be relevant in most usage scenarios, where the back will be facing toward a wall. Making the monitor slightly thinner is largely pointless when it's at the expense of functionality.

    Also, the monitor's main feature seems only half-implemented. It's supposedly an "HDR monitor", but has weak static contrast ratios. Maybe it will often look better than a standard IPS screen when fed with HDR content, but a VA panel would probably look better overall, even without support for 10-bit input. This monitor reminds me of those standard-definition televisions from a decade ago that would accept an HD signal, but then downsample it to the screen's SD resolution. Only here, we seem to be taking a high dynamic range image and displaying it on a screen with mediocre contrast. It might be a decent monitor for what it is, but it seems a bit overpriced considering what it has to offer.
  • Nintendork
    I wouldn't touch any shi*tty monitor with poor 1000:1 contrast when 3000:1 AMVA+ are available, at least till OLED floods PC.
  • bit_user
    I feel like I've been reading about >= 10-bit and HDR for like 10 years. HDMI has supported 30-bit (10 bits per channel) and 36-bit (12 bits per channel) "deep color" for about that long, and I thought there were supposed to be monitors that supported it.
  • jn77
    My still cameras record in 14-bit raw, Photoshop and Capture One work with 12- and 14-bit files, and my video cameras record in 12-bit raw. I don't own a business and I am not going to spend $6000 on a computer monitor.

    240Hz 3D 10-, 12- and 14-bit panels are way overpriced and need to come down. The same with OLED.

    All it is is a monitor, not even a TV. It's just a screen that displays whatever you throw at it.

    Look at how TVs depreciate. All it is, is a system rigged to make you pay for stuff that is already obsolete and keep consumers on the hook for upgrades.
  • shrapnel_indie
    Nintendork said:
    I wouldn't touch any shi*tty monitor with poor 1000:1 contrast when 3000:1 AMVA+ are available, at least till OLED floods PC.


    All manufacturers tend to use different formulas to calculate contrast ratio, and frequently manufacturers change their own formulas. This kind of makes contrast ratios irrelevant, unless this practice has changed and a universal formula has been agreed upon.
  • shrapnel_indie
    Let's see...

    IPS: Good
    GtG response: 6ms - outside what is considered good for gaming (5ms or less)
    Refresh rate: 60Hz - the bare minimum for gaming, and only for really low-budget builds.

    No, not a gaming monitor.
    No, not quite a developer's monitor.
    Office monitor? Too pricey.
    Executive's monitor? Now I think we have something. A monitor aimed at the hipster exec who wants to look good flaunting the latest tech.
  • ceberle
    I tried to be clear in the review that this monitor correctly processes HDR signals but with its low native contrast and edge backlight, it doesn't deliver an optimal HDR experience. There are plenty of televisions that offer similar performance.

    The only way an LCD panel will do justice to HDR is with a zone-dimming backlight. I've recently seen demos of upcoming screens from Asus that have this feature. They look stunning, to say the least. I also have Dell's UP2718Q, which has a 384-zone backlight, 1,000 nits peak output, 10-bit color, and Ultra HD resolution. That review will appear soon.

    I realize the S2718D is an early effort. It'll only get better from here!

    Christian
  • JTWrenn
    Why would anyone buy this when a 32" 4K HDR monitor from Samsung is right around the corner for $699? The UH850 seems like a much better deal.
  • 10tacle
    This is a fraudulent marketing gimmick that attempts to mimic HDR. While there are no definitive IEEE-type standards for "HDR", the fact is that real HDR displays (HDTVs as well as PC monitors) use higher-caliber panels, starting with 10-bit color and extending to color gamut, brightness, and contrast ratio expectations.

    There are three generally recognized HDR formats: HDR10, Dolby Vision, and the newest, Samsung's HDR10+. HDR10 is the one officially supported by the UHD Alliance and is being pushed as the standard, much like Blu-ray winning out over HD DVD. They put their "Ultra HD Premium" stamp of approval on qualifying products.

    This monitor fails on all of that and should not even have "HDR" in the description. Fraud if you ask me, and I'm very disappointed as a long-time Dell panel buyer. Otherwise the design looks great with a thin bezel and frame, but sacrificing height adjustment is inexcusable in this price range. They chose style over function there, and that's a fatal flaw for many.
  • DotNetMaster777
    Dell monitor looks nice!
  • photonboy
    Yep, VERY DISAPPOINTED with the "HDR" label. The main point of HDR is to have excellent black levels, for which you'd frankly want at least 3000:1, and 10-bit support (or higher) for the wider color range, which helps prevent things like BANDING.