
Dolby Vision vs. HDR10 vs. HDR10+: Which HDR format is the best?

High Dynamic Range (HDR) has become an important factor to consider when buying a new TV or streaming TV shows and movies, because HDR makes content brighter and more vibrant.

While there are five different HDR formats, we’re going to focus on the top three in terms of widespread availability: HDR10, Dolby Vision, and HDR10+. Each of these formats brings something different to the table and could dramatically enhance your viewing experience. This guide will walk you through the differences between the formats and how those differences might affect your buying decision regarding a new TV or streaming device.

What is HDR?


HDR unlocks more colors, higher levels of contrast, and significantly greater brightness. With a broader color gamut than Standard Dynamic Range (SDR), HDR opens up hues that TVs couldn’t reproduce for many years, allowing today’s 4K TVs to display far more realistic images. SDR can only present a fraction of the color depth that HDR can: SDR displays show just 256 shades each of red, green, and blue, while HDR displays show at least 1,024.

To display HDR content, you need both an HDR-capable TV and a source of HDR content, whether that’s a media player like an Ultra HD Blu-ray player or a streaming service like Netflix or Hulu.


Image quality

HDR’s big advantage is better image quality. Whether you’re watching the latest Marvel film or playing a game on your Xbox console, better image quality will improve your overall experience. But how exactly does HDR deliver that improved picture quality when compared with SDR? The answer lies in three elements: Bit depth, brightness, and metadata. Let’s take a quick look at each one and how the different HDR formats use them.


Bit depth

Bit depth describes the number of colors a movie or TV show includes as well as the number of colors a TV can display. Each pixel of your TV is made up of three discrete colors: Red, green, and blue (RGB). Each of these colors can be broken down into hues. The greater the bit depth, the greater the number of hues you get, and thus the greater the number of colors.

SDR content, for instance, uses a bit depth of 8 bits. Eight bits allows for up to 256 hues each of R, G, and B. If you multiply 256 x 256 x 256, you get 16.7 million possible colors. That sounds like a lot, but when you look at the HDR10 and HDR10+ formats — which use 10 bits and can display up to 1.07 billion colors — it’s easy to see that HDR is much more colorful. Dolby Vision takes that up a notch with 12 bits, for a maximum of 68.7 billion colors.
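If you want to check that math yourself, a short Python snippet (just an illustration, nothing HDR-specific) reproduces the figures above:

```python
# Shades per channel double with every extra bit, and the total color
# count is the cube of that (one factor each for red, green, and blue).
for bits, fmt in [(8, "SDR"), (10, "HDR10 / HDR10+"), (12, "Dolby Vision")]:
    shades = 2 ** bits      # shades per channel (R, G, or B)
    colors = shades ** 3    # every possible R/G/B combination
    print(f"{fmt}: {shades:,} shades per channel, {colors:,} colors")
```

Running it prints 16,777,216 colors for 8-bit, 1,073,741,824 for 10-bit, and 68,719,476,736 for 12-bit, which are the rounded numbers quoted above.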

While TVs that can handle 10-bit color are quite common, no TVs support 12-bit color yet, so Dolby Vision’s massive color advantage is going to be moot for the time being.

Brightness

TV brightness is measured in candelas per square meter (cd/m²), also known as nits (1 nit is equal to 1 cd/m²). When it comes to peak brightness, Dolby Vision takes the crown: It can support display brightness of up to 10,000 cd/m², whereas both HDR10 and HDR10+ max out at 4,000 cd/m². For now, you’ll be hard-pressed to find TVs that can deliver anywhere close to 4,000 cd/m², but TVs get brighter every year, which makes Dolby Vision more future-proof than the other formats.

Metadata

Metadata, in the context of HDR, is an extra layer of information that tells a TV how it should display the content it’s receiving. This information covers things like peak brightness, contrast, and something known as tone mapping, all of which contribute to making HDR video look so much better than SDR. However, not all HDR formats use the same kind of metadata. HDR10 uses static metadata, which means the information governing brightness, contrast, and so on is delivered to the TV once, at the beginning of a movie or TV show, and doesn’t change until you switch to something else.

Dolby Vision and HDR10+, on the other hand, use dynamic metadata. This allows each scene — or even each frame of video — to be properly adjusted for the best results.
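To make that difference concrete, here’s a deliberately simplified Python sketch. It’s a toy model, not how any real TV tone-maps (actual displays use carefully tuned perceptual curves, and the real metadata formats carry far more than a single peak value), but it shows why one film-wide brightness figure can hurt dim scenes on a less capable display:

```python
# Toy model: a 1,000-nit TV showing a film mastered with a 4,000-nit
# peak. Static metadata forces every scene to be scaled against the
# film-wide peak; dynamic metadata lets each scene use its own.
# (Hypothetical simplification: linear scaling instead of real tone curves.)

TV_PEAK = 1000  # nits our hypothetical display can reach

def tone_map(pixel_nits: float, source_peak: float) -> float:
    """Naive linear tone mapping: squeeze the source range onto the TV."""
    return pixel_nits * min(1.0, TV_PEAK / source_peak)

scenes = [
    {"name": "dim interior", "peak": 200, "pixel": 180},
    {"name": "sunlit beach", "peak": 4000, "pixel": 3500},
]
film_peak = max(s["peak"] for s in scenes)  # all that static metadata reports

for s in scenes:
    static = tone_map(s["pixel"], film_peak)    # HDR10-style
    dynamic = tone_map(s["pixel"], s["peak"])   # Dolby Vision / HDR10+ style
    print(f"{s['name']}: static {static:.0f} nits, dynamic {dynamic:.0f} nits")
```

With the film-wide (static) peak, the dim interior is scaled down from 180 nits to a murky 45; with per-scene (dynamic) metadata, it’s left untouched. Preserving exactly this kind of detail is what dynamic metadata is for.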

Availability


HDR10 is considered the default HDR format. This means that if a movie is presented in HDR, a streaming media device claims to support HDR, or a TV is marketed as an HDR TV, they will all support HDR10 at a minimum. This near-universal support puts HDR10 head and shoulders above Dolby Vision and HDR10+ when it comes to availability, both in content and devices.

But Dolby Vision, once considered a hard-to-find premium option, is quickly catching up to HDR10. You’ll find Dolby Vision support on plenty of HDR TVs with the exception of Samsung TVs. Samsung continues to be the one TV maker that refuses to pay Dolby’s licensing fees for Dolby Vision. Dolby Vision content is also becoming more commonplace. You’ll find it on UHD Blu-ray discs as well as streaming services such as Disney+, Apple TV+, Netflix, and Amazon Prime Video, just to name some of the most popular options.

Pulling up the rear is HDR10+. Despite its royalty-free licensing, it has the lowest adoption among TV makers. In the U.S., you’ll find it on all Samsung HDR TVs and select Vizio and Hisense models. Elsewhere in the world, Panasonic, TCL, Toshiba, and Philips sell HDR10+-capable TVs. And while your TV may support HDR10+, finding content in this format could prove challenging. Right now, Amazon Prime Video is one of the few reliable sources of HDR10+ material.

Conclusion

With better brightness, color, and the benefits of dynamic metadata, Dolby Vision is clearly the best HDR format. It’s supported on TVs from LG, Vizio, TCL, Hisense, and Sony, and you can find it on an increasing number of the top streaming services.

However, HDR10, as the de facto HDR format, remains the most accessible of the three from both a content and a device point of view. HDR10+ may offer many of the same benefits as Dolby Vision, but its slow adoption among TV makers and content creators/distributors makes it more of a footnote in the HDR space than a truly viable alternative to HDR10 and Dolby Vision. That could easily change in the future, though: There are few, if any, roadblocks to its broader adoption.

But here’s the good news: HDR formats aren’t mutually exclusive. If you buy a TV that doesn’t support Dolby Vision, you can still watch HDR10 (or HDR10+, if applicable). And because streaming providers can offer the same movie in multiple HDR formats, your TV should automatically receive the highest-quality format it supports. For instance, if you stream a Marvel movie from Disney+ that is available in Dolby Vision, your TV will still get the HDR10 version if that’s the best format it can display.
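That fallback logic is conceptually simple. Here’s a hypothetical Python sketch of the kind of selection a streaming app might perform; the format names and the preference order are illustrative, not any particular service’s actual API:

```python
# Preference order from most to least capable, ending in plain SDR.
PREFERENCE = ["dolby_vision", "hdr10_plus", "hdr10", "sdr"]

def pick_stream(available: set[str], tv_supports: set[str]) -> str:
    """Return the best format the service offers that the TV can display."""
    for fmt in PREFERENCE:
        if fmt in available and fmt in tv_supports:
            return fmt
    return "sdr"  # virtually every title has an SDR fallback

# Example: a title offered in Dolby Vision and HDR10, watched on a TV
# that supports HDR10+ and HDR10 (a Samsung set, say).
print(pick_stream({"dolby_vision", "hdr10"}, {"hdr10_plus", "hdr10"}))
# -> hdr10
```

In practice, this negotiation happens automatically between the app and the TV, which is why you rarely have to think about which format you’re actually getting.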
