Dell S2718D HDR Monitor Review

Grayscale, Gamma & Color

Grayscale Tracking

Two of the S2718D’s image modes shift the color temperature warmer or cooler, and the remaining presets use variations that don’t quite hit D65. Standard is the default setting, so that’s where we begin our tests.

Our grayscale and gamma tests are described in detail here.

In the first chart, you can see that blue runs low enough that most brightness levels take on a red/green tint, which becomes more visible as output increases. The Cool setting goes too far in the other direction, giving everything a blue cast. Custom Color produces a similar result out of the box, so the only way to maximize the S2718D’s potential is to adjust its RGB sliders. Calibration produces excellent tracking, with errors well below the visible threshold. Dial in our recommended settings and you should be able to duplicate our chart closely.


The S2718D gains considerably from its default to its calibrated state. 5.41dE is one of the higher out-of-box errors we’ve seen of late, but the final 0.63dE average error is quite low. So while it’s a bit weak at first, the potential for greatness is there in the Custom Color mode. And as you’ll see on page five, HDR accuracy is very good.
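For readers curious how averages like 5.41dE are derived, here's a minimal sketch. The meter readings below are invented for illustration, and we use the simple CIE76 color-difference formula; the review's actual software may use a different metric (e.g. dE2000), but the idea is the same: convert each measured and target patch to CIELAB and average the distances.

```python
import math

# D65 reference white (XYZ, normalized to Y = 100)
WHITE = (95.047, 100.0, 108.883)

def xyz_to_lab(xyz, white=WHITE):
    # standard CIE XYZ -> L*a*b* conversion
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(c / w) for c, w in zip(xyz, white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(lab1, lab2):
    # CIE76 delta-E: Euclidean distance in L*a*b*
    return math.dist(lab1, lab2)

# Illustrative (made-up) readings: (measured XYZ, target XYZ) per gray step
measurements = [
    ((20.1, 21.3, 25.0), (20.3, 21.4, 23.2)),
    ((47.8, 50.2, 57.9), (48.0, 50.5, 54.7)),
    ((90.5, 95.1, 107.0), (90.4, 95.0, 102.8)),
]
errors = [delta_e76(xyz_to_lab(m), xyz_to_lab(t)) for m, t in measurements]
avg = sum(errors) / len(errors)
print(f"average grayscale error: {avg:.2f} dE")
```

Errors below roughly 3dE are generally considered invisible to the eye, which is why a calibrated average under 1dE is effectively perfect.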

Gamma Response

Luminance tracking is a key component of proper HDR rendering, so it’s good to see the S2718D excelling in that area for SDR content. Tracking is nearly perfect both by default and after calibration. There are a few tiny dips in the Custom Color mode pre-adjustment, but our changes cleaned that up nicely. Gamma is perhaps the most important thing to get right on any monitor, so we’re glad to see Dell’s attention to detail in this area.


With a tight 0.14 value range and a slight 0.45% deviation from the 2.2 standard, the S2718D takes the gamma comparison over its non-HDR competition. There are no gamma adjustments available, so accurate tracking out of the box is a very good thing.
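To see where a value range and percent deviation like these come from, here is a minimal sketch with invented luminance readings. Per-step gamma is the log of normalized luminance over the log of normalized signal level; the range is the spread between the highest and lowest step, and the deviation is how far the average sits from the 2.2 target.

```python
import math

def step_gamma(signal, luminance, white):
    # gamma at one gray step: log of normalized output over log of normalized input
    return math.log(luminance / white) / math.log(signal / 255)

# Illustrative (made-up) luminance sweep, peak white at 200 nits
white = 200.0
readings = {26: 1.35, 77: 14.5, 128: 44.5, 179: 92.0, 230: 160.0}

gammas = [step_gamma(s, y, white) for s, y in readings.items()]
spread = max(gammas) - min(gammas)
avg = sum(gammas) / len(gammas)
deviation = abs(avg - 2.2) / 2.2 * 100  # percent off the 2.2 standard
print(f"avg {avg:.2f}, spread {spread:.2f}, {deviation:.2f}% from 2.2")
```

A tight spread means shadows and highlights are rendered with consistent weight across the whole brightness range, which matters more in practice than hitting exactly 2.2 at any single step.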

Color Gamut & Luminance

For details on our color gamut testing and volume calculations, please click here.

Thanks to accurate gamma tracking, saturation and luminance targets are mostly met even in the S2718D’s default Standard mode. We can see slight under-saturation in blue and a hue error in cyan, but other colors are fine. This is important, since Ultra HD/HDR content is usually mastered in the wider DCI-P3 color gamut. Because we’re working with an sRGB monitor, it must track accurately to properly render detail when the content’s colorspace is larger than the panel’s. You’ll see what we mean on the next page where we’ll show you the extended gamut test results.


We weren’t thrilled with the S2718D’s default grayscale numbers, but without calibration, its color gamut error averages just 3.66dE. Afterwards, that value drops to a very respectable 1.59dE. The adjustments are clearly worth making, even if you just dial in our recommended settings.

Gamut volume is an almost perfect 99.34% of the sRGB space. You can use this monitor for color-critical applications if it’s properly adjusted. The native gamut is pretty much right on target.
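As a rough illustration of how a coverage percentage like this can be derived from measured primaries: the sketch below compares the area of the measured red/green/blue triangle against the sRGB triangle in CIE xy coordinates. The measured primaries are invented for illustration, and note that real gamut-volume calculations work in three dimensions in a space like CIELAB; this 2D version is only a simplified stand-in.

```python
def triangle_area(p1, p2, p3):
    # shoelace formula for the area of a triangle in CIE xy space
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# sRGB primaries in CIE xy (red, green, blue)
SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

# Illustrative (made-up) measured primaries, close to the sRGB targets
measured = [(0.638, 0.331), (0.301, 0.597), (0.151, 0.063)]

coverage = triangle_area(*measured) / triangle_area(*SRGB) * 100
print(f"gamut area vs. sRGB: {coverage:.2f}%")
```

A panel whose primaries land almost exactly on the sRGB targets, as this one's do, will score near 100% by this kind of measure.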

MORE: Best Gaming Monitors

MORE: Best Professional Monitors

MORE: How We Test Monitors

MORE: How To Choose A Monitor

MORE: All Monitor Content

    Can't wait till they have 4K in HDR at affordable prices
  • daglesj
    I'll take the features but ditch the stand for a standard removable with VESA and the controls built in as normal.
  • GentlemanGreen
    stopped reading at 60 hz
  • LionD
    How could 8 bit/sRGB display, with contrast 1000 and no local dimming, deliver true HDR experience? Total nonsense.
  • CarbonBased

    Lots of people have plenty of use for 60Hz screens. Stop poo-pooing products that clearly aren't aimed at you. I have a rig for gaming, and sure, 60Hz isn't really enough anymore. However, I take and edit photos as a hobby, so IPS, 10-bit, HDR, all very attractive features. Add that I can mate it to my photo editing laptop with a USB-C cable and we're really getting somewhere. I'll be looking for this one come holiday season.
  • cbliss
    NOT AN HDR MONITOR.. FALSE ADVERTISING.. BUYERS BEWARE!! (HDR requires a 10-bit panel; this is 8-bit. It also lacks any form of local dimming.) Bogus product for HDR, otherwise simply an overpriced QHD monitor.
  • CarbonBased
    Fair enough, I didn't realize that it was 8 instead of 10 bit. But I will stand by my point that 60Hz is fine for many, if not most, computer users, even if they are gamers. The market for high refresh rates is specifically gamer-centric. Dissing products that aren't built to gamer spec because you are a gamer does not lend one to being an unbiased source of opinion.
  • Scott____67
    I like to wall mount my monitor anyways, so the stand is non-existent, and in a condo it keeps areas and desk spaces clear. Plus, having a little height with a downward pitch is perfect for the lean-back-in-the-chair gamer that I am
  • alextheblue
    2460450 said:
    Fair enough, I didn't realize that it was 8 instead of 10 bit. But I will stand by my point that 60Hz is fine for many, if not most, computer users, even if they are gamers. The market for high refresh rates is specifically gamer-centric. Dissing products that aren't built to gamer spec because you are a gamer does not lend one to being an unbiased source of opinion.

    Agreed. A 60hz monitor isn't great for gaming anymore, so for my personal needs and budget I'm better off with a halfway decent TN panel with high refresh rate, wide freesync range, and low input lag. That might change in the future, as advanced displays come down in price. But today that's what best fits my needs.

    But as you said most non-gaming applications don't need high refresh rates. Users who don't game will typically favor resolution, contrast, brightness, viewing angle, and color reproduction over refresh rate and input latency. If you have a sub-$300 budget like I do you often end up with a display that either favors gaming performance and features, or image quality and advanced colorspaces. Just because you favor a high-refresh gaming monitor doesn't mean you can't recognize uses for a non-gaming display.

    Granted if you spend enough money you can get a display that doesn't compromise much and is fairly good at everything. Way out of my price range at this point, though.
  • alextheblue
    To see HDR content, you’ll need a compatible player or computer with an HDMI 2.0/HDCP 2.2 output. The latest Ultra HD Blu-ray players feature this interface. You can also connect with the right video card. Fortunately, there are quite a few choices. On the Nvidia side is the GTX 950 up to the Titan X (Maxwell), or the GTX 1050 to Titan X (Pascal). AMD users can employ an R9 390X or RX 460, 470, or 480.

    I thought anything with Polaris would support HDR10, such as Radeon 540/550 (Polaris 12). Maybe I'm misremembering. Also, on PC you have to use HDR10-compatible playback software to benefit.

    On the console side of things, Xbox One S has supported Ultra HD (4K HDR10) BDs for some time. If I was looking for a dedicated box, it's a good choice even if you don't play console games. It's not much more than a decent dedicated 4K HDR10 player, and it has better support for apps. You can add a Kinect if you want voice control. If you don't use physical discs but want a dedicated box for 4K HDR streams, then I'd recommend a Roku Premiere+ or Ultra.
  • rantoc
    This monitor is as much HDR as the consoles are 4k capable ;)
  • i-am-i-u-r-u
    Way overpriced and phony marketing as HDR. The after-calibration black luminance of 0.2663 and contrast ratio of 761 is pitiful.
  • cryoburner
    142182 said:
    I'll take the features but ditch the stand for a standard removable with VESA and the controls built in as normal.

    I don't get why they would give this relatively high-end monitor such a mediocre non-removable stand and no VESA mounts. It might make the monitor look relatively nice when viewed from the back and side, but that's not likely to be relevant in most usage scenarios, where the back will be facing toward a wall. Making the monitor slightly thinner is largely pointless when it's at the expense of functionality.

    Also, the monitor's main feature seems only half-implemented. It's supposedly an "HDR monitor", but has weak static contrast ratios. Maybe it will often look better than a standard IPS screen when fed with HDR content, but a VA panel would probably look better overall, even without support for 10-bit input. This monitor reminds me of those standard-definition televisions from a decade ago that would accept an HD signal, but then downsample it to the screen's SD resolution, only here, we seem to be taking a high dynamic range image and displaying it on a screen with mediocre contrast. It might be a decent monitor for what it is, but it seems a bit overpriced considering what it has to offer.
  • Nintendork
    I wouldn't touch any shi*tty monitor with poor 1000:1 contrast when 3000:1 AMVA+ are available, at least till OLED floods PC.
  • bit_user
    I feel like I've been reading about >= 10-bit and HDR for like 10 years. HDMI has supported 30-bit (10 bits per channel) and 36-bit (12 bits per channel) "deep color" for about that long, and I thought there were supposed to be monitors that supported it.
  • jn77
    My still cameras record in 14-bit RAW, and Photoshop and Capture One work with 12- and 14-bit files. My video cameras record in 12-bit raw. I don't own a business, and I am not going to spend $6000 on a computer monitor.

    240Hz 3D 10-, 12-, and 14-bit panels are way overpriced and need to come down. The same with OLED.

    All it is is a monitor, not even a TV. It is a dead screen to display anything you throw at it.

    Look at how TV's depreciate. All it is, is a system rigged to make you pay for stuff that is already obsolete and keep consumers on the hook for upgrades.
  • shrapnel_indie
    205977 said:
    I wouldn't touch any shi*tty monitor with poor 1000:1 contrast when 3000:1 AMVA+ are available, at least till OLED floods PC.

    All manufacturers tend to use different formulas to calculate contrast ratio, and frequently manufacturers change their own formulas. This kind of makes contrast ratios irrelevant, unless this practice has changed and a universal formula has been agreed upon.
  • shrapnel_indie
    Let's see...

    IPS: Good
    gtg refresh: 6ms - outside what is considered good for gaming (5ms or less)
    Framerate: 60 fps - bare minimum to look at for gaming, and that being only for the really low budget concerns.

    No, not a gaming monitor.
    No, not quite a Developer's monitor.
    Office monitor? Too pricey.
    Executive's monitor? Now I think we have something. A monitor aimed to cater to the hipster exec who wants to look good flaunting the latest tech.
  • ceberle
    I tried to be clear in the review that this monitor correctly processes HDR signals but with its low native contrast and edge backlight, it doesn't deliver an optimal HDR experience. There are plenty of televisions that offer similar performance.

    The only way an LCD panel will do justice to HDR is with a zone-dimming backlight. I've recently seen demos of upcoming screens from Asus that have this feature. They look stunning to say the least. I also have the UP2718Q from Dell that has a 384-zone backlight with 1000nits, 10-bit color, and Ultra HD. That review will appear soon.

    I realize the S2718D is an early effort. It'll only get better from here!

  • JTWrenn
    Why would anyone buy this when a 32" 4K HDR monitor from Samsung is right around the corner for $699? The UH850 seems like a much better deal.
  • 10tacle
    This is a fraudulent marketing gimmick that attempts to mimic HDR. While there are no definitive IEEE-type standards for "HDR", the fact is that real HDR panels (HDTVs as well as PC monitors) have higher caliber panels. Specifically, that starts with a 10-bit panel and extends to color gamut, brightness, and contrast ratio expectations.

    While there are no official standards for HDR, there are three generally recognized formats: HDR10, Dolby Vision, and the newest, Samsung's HDR10+. HDR10 is the one officially supported by the UHD Alliance and is being pushed as the standard, similar to Blu-ray winning the format war over HD DVD. They put their stamp of approval as "Ultra HD Premium" on qualifying products.

    This monitor fails on everything and should not even have "HDR" in the description. Fraud if you ask me, and I'm very disappointed as a long time Dell panel buyer. Otherwise the design looks great with a thin bezel and frame, but sacrificing height adjustment is inexcusable in this price range. They chose style over function there, and that's a fatal flaw for many.
  • DotNetMaster777
    Dell Monitor looks nice !
  • photonboy
    Yep, VERY DISAPPOINTED with the "HDR" label. The main point of HDR is excellent black levels, for which you'd frankly want at least 3000:1, plus 10-bit support (or higher) for the wider color range that helps prevent things like BANDING.