The way we see things has changed a lot since the days of rabbit-ear antenna televisions. It’s safe to say we have a clearer vision of the future, and it seems that with every new year comes a new resolution. With these ever-expanding pixel counts, it can be hard to keep up — but not to worry: Not only do these sharper resolutions heighten your future viewing experience, but the costs also tend to come back down to earth once a new standard is set.
High dynamic range, more commonly referred to as HDR, is one of the most important new video technologies since the upgrade from standard definition to HD. But HDR comes in many flavors. You’ve probably heard terms like Dolby Vision, HDR10, HLG, or more recently, HDR10+. But what exactly is HDR10+? How can you get it? And perhaps most importantly, is it the best HDR format? We’re glad you asked! Below, we’ll shed some much-needed light on all of these questions and more.
What is HDR?
Before we can dive into HDR10+, we need to make sure we understand HDR. We’ve got a few fantastic deep dives on this technology that you can peruse at your leisure, but for the sake of a quick introduction: High dynamic range, as it pertains to TVs, allows for video and still images with much greater brightness and contrast, as well as better color accuracy, than was possible in the past. HDR works for movies, TV shows, and video games. Unlike increases in resolution (like 720p to 1080p), which aren’t always immediately noticeable, especially when viewed from a distance, great HDR material is eye-catching from the moment you see it.
HDR requires two things at a minimum: A TV that is HDR-capable and a source of HDR video, like a 4K HDR Blu-ray disc and compatible Blu-ray player, or an HDR movie on Netflix. Confused consumers often conflate 4K and HDR, but they are very different technologies; not all 4K TVs can handle HDR, and some do it much better than others. That said, most new TVs support both 4K UHD and HDR.
But saying “HDR” is like saying “digital music”: There are several different types of HDR, and each has its own strengths and weaknesses.
What is HDR10?
Every TV that is HDR-capable is compatible with HDR10. It’s the minimum specification. The HDR10 format allows for a maximum brightness of 1,000 nits (the unit used to measure a display’s brightness) and a color depth of 10 bits. On their own, those numbers don’t mean much, but in context they do: Compared to regular SDR (standard dynamic range), HDR10 allows for an image that is over twice as bright, with a corresponding increase in contrast (the difference between the blackest blacks and the whitest whites), and a color palette of over one billion shades, as opposed to the measly 16 million of SDR.
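Those shade counts come straight from the bit depth: each color channel with b bits can take 2^b values, and a pixel mixes three channels (red, green, and blue). Here’s the arithmetic as a quick Python sketch, including the 12-bit figure that comes up later with Dolby Vision:

```python
# Total displayable colors for a given bit depth per channel.
# A pixel combines three channels (R, G, B), so the total is (2 ** bits) ** 3.

def total_colors(bits_per_channel: int) -> int:
    shades_per_channel = 2 ** bits_per_channel
    return shades_per_channel ** 3

for bits, label in [(8, "SDR"), (10, "HDR10/HDR10+"), (12, "Dolby Vision")]:
    print(f"{label} ({bits}-bit): {total_colors(bits):,} colors")

# SDR (8-bit): 16,777,216 colors
# HDR10/HDR10+ (10-bit): 1,073,741,824 colors
# Dolby Vision (12-bit): 68,719,476,736 colors
```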
As with all HDR formats, how much benefit you get from HDR10 depends on the quality of the TV on which you view it. When implemented properly, HDR10 makes video content look dramatically better than SDR, but it is no longer the top of the HDR food chain.
What is HDR10+?
As the name suggests, HDR10+ takes all of the good parts of HDR10 and improves upon them. It quadruples the maximum brightness to 4,000 nits, which in turn increases contrast. But the biggest difference is in how HDR10+ handles information. With HDR10, the “metadata” that is fed by the content source is static, which means there’s one set of values established for a whole piece of content, like an entire movie. HDR10+ makes this metadata dynamic, allowing it to change for each scene or even each frame of video. This means every frame is treated to its own set of color, brightness, and contrast parameters, making for a much more realistic-looking image. Areas of the screen that might have been oversaturated under HDR10 will display their full details with HDR10+.

But wait, there’s more: Samsung, long a proponent of HDR10+, has kicked things up yet another notch. The company’s HDR10+ Adaptive technology allows your TV to detect the brightness of your viewing space and make micro adjustments to brightness, contrast, and other settings in response to changes in the room.
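To make the static-versus-dynamic distinction concrete, here’s a minimal, purely illustrative Python sketch. It isn’t the real bitstream syntax (HDR10’s static metadata is defined by SMPTE ST 2086 plus the MaxCLL/MaxFALL values, while HDR10+’s dynamic metadata is defined by SMPTE ST 2094-40), and the tone-mapping math is deliberately naive, but it shows why per-scene values let a TV use its limited brightness more intelligently:

```python
# Illustrative only: toy tone mapping with static vs. dynamic metadata.
# Field names and the linear math are simplified stand-ins.

TV_PEAK_NITS = 600  # what a hypothetical midrange TV can actually display

# HDR10-style static metadata: one value for the entire movie.
static_max_nits = 4000

# HDR10+-style dynamic metadata: one value per scene (or even per frame).
scene_max_nits = [180, 950, 4000, 320]  # night scene, indoors, explosion, dusk

def tone_map(pixel_nits: float, content_max: float) -> float:
    """Naively compress content brightness into the TV's displayable range."""
    return pixel_nits * min(1.0, TV_PEAK_NITS / content_max)

# A 180-nit highlight in the dark opening scene:
print(tone_map(180, static_max_nits))    # 27.0  (dimmed to leave headroom)
print(tone_map(180, scene_max_nits[0]))  # 180.0 (displayed at full brightness)
```

With only a movie-wide maximum to go on, the TV has to dim everything to leave headroom for the brightest moment in the entire film; with per-scene values, dark scenes keep their punch and detail.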
The catch here is that, despite being a royalty-free format, HDR10+ was developed by a consortium of three companies — 20th Century Fox, Panasonic, and Samsung. As such, HDR10+ compatibility has so far been limited to TV models by Samsung and Panasonic. And here’s catch number two: On the content side of the equation, there isn’t a lot of support for HDR10+, though that’s changing slowly but surely.
Amazon Prime Video supports the tech, while Netflix and many other streamers have yet to adopt it. In April 2019, Universal made a commitment to release both new and back-catalog titles in HDR10+, but 20th Century Fox, once set to do the same, now looks to be cozying up to Dolby Vision. Many industry insiders saw this coming, as the company is owned by Disney, which has embraced Dolby Vision with open arms.
So … what about Dolby Vision?
HDR10+ isn’t the only HDR format with ambitions of becoming the next king of the HDR castle. Dolby Vision is an advanced HDR format created by Dolby Labs, the same organization behind the famous collection of Dolby audio technologies like Dolby Digital and Dolby Atmos. Dolby Vision is very similar to HDR10+ in that it uses dynamic, not static, metadata, giving each frame its own unique HDR treatment. But Dolby Vision provides for even greater brightness (up to 10,000 nits) and more colors, too (12-bit depth, for a staggering 68 billion colors).
For now, these specs are a bit moot: There are no 12-bit-depth-capable TVs yet, and brightness of that caliber remains the stuff of prototypes. But both are certainly coming in the years ahead, and Dolby Vision is ready and waiting. Unlike HDR10+, which only had its official launch in 2018 and has so far seen limited uptake by both content and hardware companies, Dolby Vision has been around for several years and enjoys wide industry support, which could help make it the HDR standard.
Part of the reason Dolby Vision is less abundant than HDR10 is the fact that it’s a proprietary technology, and companies that wish to implement it in content or hardware must pay a licensing fee to do so. HDR10+, like its predecessor HDR10, is open source and royalty free, which could cause its adoption rate to explode in the coming years, especially among budget-friendly TV manufacturers.
Oh no, not another format war!
Does the presence of competing HDR formats like HDR10+ and Dolby Vision mean we’re in for another format war? Not exactly. Unlike previous tech tiffs like Blu-ray versus HD DVD, HDR formats are not mutually exclusive. This means there’s nothing stopping a movie studio from releasing a Blu-ray that contains HDR10, HDR10+, and Dolby Vision metadata on a single disc.
A TV that supports HDR can support multiple HDR formats, and many of today’s TVs do just that. The most common combo is HDR10 and Dolby Vision support on a single TV; however, we’re also just beginning to see the arrival of TVs that add HDR10+ and even HLG (the version of HDR favored by digital TV broadcasters) to that mix. It’s also possible that some TVs that shipped from the factory with support for just two formats — say HDR10 and Dolby Vision — could be updated via a firmware upgrade to handle HDR10+.
Blu-ray players and media streamers can also support multiple HDR formats. The challenge is that, despite the ability to support multiple HDR formats, very few TVs, playback hardware devices, streaming video services, or Blu-rays actually do. This means that, as consumers, we need to pay close attention to the labels to understand the capabilities of the devices and content we own — and the ones we plan on buying.
Many Blu-ray players, for instance, only offer support for HDR10, while some newer ones, like Sony’s UBP-X800M2, add Dolby Vision support. The same considerations apply to set-top streaming boxes. At the moment, there are only three streaming devices that can handle HDR10, HDR10+, and Dolby Vision: the Chromecast with Google TV, Amazon’s Fire TV Cube, and the Fire TV Stick 4K. That’s not surprising, given that Amazon’s Prime Video service also supports all three formats. Others, like the Apple TV 4K, support HDR10 and Dolby Vision but not HDR10+.
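Keeping track of which box supports what can feel like homework, but the logic behind what you’ll actually see is simple: playback uses the best format that your source, player, and TV all share, with plain HDR10 as the common floor. Here’s a hypothetical Python sketch of that logic; the device lists are just the examples named above, not an exhaustive database:

```python
# Hypothetical helper: pick the best HDR format a whole chain can agree on.
# Preference order and device data are illustrative, drawn from the examples above.

PREFERENCE = ["Dolby Vision", "HDR10+", "HDR10"]  # best first

DEVICES = {
    "Fire TV Stick 4K": {"HDR10", "HDR10+", "Dolby Vision"},
    "Chromecast with Google TV": {"HDR10", "HDR10+", "Dolby Vision"},
    "Apple TV 4K": {"HDR10", "Dolby Vision"},
}

def best_common_format(*links):
    """Return the most-preferred format every link in the chain supports."""
    shared = set.intersection(*links)
    for fmt in PREFERENCE:
        if fmt in shared:
            return fmt
    return None  # no HDR at all; the TV falls back to SDR

# A Samsung-style TV (HDR10+ but no Dolby Vision) fed by an Apple TV 4K:
tv = {"HDR10", "HDR10+"}
print(best_common_format(DEVICES["Apple TV 4K"], tv))  # -> "HDR10"
```

Real devices negotiate this over HDMI automatically, but the takeaway is the same: the weakest link in your chain decides which flavor of HDR you actually watch.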
What equipment do I need to get HDR10+?
To summarize, HDR10+ is a new format of HDR that offers higher levels of brightness and contrast plus more true-to-life colors and detail. To get it, you’ll need:
- A source of HDR10+ video, such as a Blu-ray movie or Amazon Prime Video (with more to follow)
- A device that is capable of reading HDR10+ encoded material, like a compatible Blu-ray player or media streamer
- A TV that is HDR10+ compatible (these may also have built-in apps that let you sidestep the need for a playback device)
One more thing: If you’re using a media streamer or a Blu-ray player for your HDR10+ content and it doesn’t plug directly into your TV, the HDMI cable you’re using should ideally be rated for HDMI 2.1 (sold as Ultra High Speed cables). High-bit-depth HDR video can demand more bandwidth than older HDMI 2.0 cables were certified to carry, and the HDMI 2.1 spec also added dedicated support for transporting dynamic HDR metadata like HDR10+.
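To see why bit depth matters for bandwidth, a rough back-of-envelope calculation helps. This Python sketch counts only raw pixel data, ignoring blanking intervals, audio, and chroma subsampling, so real HDMI link requirements are somewhat higher:

```python
# Rough estimate of raw video bandwidth: pixels per second * bits per pixel.
# Ignores blanking intervals, audio, and chroma subsampling.

def raw_video_gbps(width: int, height: int, fps: int, bits_per_channel: int) -> float:
    bits_per_pixel = bits_per_channel * 3  # one value each for R, G, and B
    return width * height * fps * bits_per_pixel / 1e9

print(raw_video_gbps(3840, 2160, 60, 8))   # ~11.9 Gbit/s: 4K60 SDR
print(raw_video_gbps(3840, 2160, 60, 10))  # ~14.9 Gbit/s: 4K60 10-bit HDR
print(raw_video_gbps(3840, 2160, 60, 12))  # ~17.9 Gbit/s: 4K60 12-bit
```

HDMI 2.0’s 18 Gbit/s link carries roughly 14.4 Gbit/s of usable video data, so 10-bit 4K at 60 frames per second already sits right at its ceiling; HDMI 2.1 raises the link to 48 Gbit/s, leaving plenty of headroom.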
So that’s that! Whether you’re looking to upgrade your home theater system or you just want to understand this cool tech, that’s really all you need to know. Stay tuned for updates!