It's the era of 4K Ultra HD TVs, and a mid-generation technology has stepped in alongside it. We have had 4K UHD for a few years now, and the newest addition to it is High Dynamic Range, better known as HDR.

While it has been around for a while, there is still some mystery surrounding it. So do you really need HDR?

What Is HDR?

Most of us know High Dynamic Range as a camera feature, thanks to its heavy use in the marketing of smartphone cameras. Dynamic range refers to the difference between the darkest and brightest areas of a scene, and a high dynamic range means this difference is large.

Basically, with HDR, a scene is processed with the lighting and contrast of its individual areas in mind: the bright and dark parts are handled separately, which leads to a more realistic reproduction of the scene.
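If you want to attach a rough number to "dynamic range," one common way is to count the stops (doublings of brightness) between a display's black level and its peak brightness. The short Python sketch below does exactly that; the luminance values are purely illustrative assumptions, not the specs of any particular TV.

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops: each stop is a doubling of brightness."""
    return math.log2(peak_nits / black_nits)

# Illustrative numbers only (assumed, not measured):
# a typical SDR LCD vs. a bright HDR panel.
sdr_stops = dynamic_range_stops(peak_nits=100, black_nits=0.1)    # ~10 stops
hdr_stops = dynamic_range_stops(peak_nits=1000, black_nits=0.05)  # ~14.3 stops

print(f"SDR: {sdr_stops:.1f} stops, HDR: {hdr_stops:.1f} stops")
```

The bigger that gap, the more room a TV has to show bright highlights and deep shadows in the same frame, which is what the "more realistic" look comes down to.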

HDR TVs became popular in 2016 when 4K was just starting to be accepted by consumers on a wide scale. Unlike generational technologies like 4K, however, HDR hasn't become mainstream, and there are a few reasons for that.

The Double Standards: HDR10 and Dolby Vision

The technology industry has faced format wars for ages, and HDR is no exception. Right now there are two prominent standards, namely HDR10 and Dolby Vision. This is the prime reason why it might be too early for you to invest in an HDR display right now.

It's HD DVD versus Blu-ray all over again. Whenever two standards compete for the same slot in the market, it's risky to pick between them. HDR10 is the free and open standard, whereas Dolby Vision is Dolby's proprietary one.

For a deeper insight, take a look at our HDR10 versus Dolby Vision comparison.

HDR10 is present in all HDR TVs, while only a few TVs support Dolby Vision; every TV that supports Dolby Vision also supports HDR10, but not the other way around. For now, both formats have their advantages and disadvantages, but the industry will eventually settle on one.

Additionally, Samsung has its own revision of HDR10, called HDR10+. It improves upon the flaws of HDR10 and puts it in direct competition with Dolby Vision. That makes the fight a closer one, which means it might be a while before the industry settles on a single standard.

The Lack of HDR Content

HDR is still a relatively new technology, and content for it hasn't become mainstream yet. The dual-standard problem shows up in HDR content as well.


HDR for Video Content

As of now, HDR is being slowly adopted by content providers. The biggest name on the list is Netflix, which makes some of its original titles in 4K HDR. Amazon Prime Video also supports 4K and HDR for some titles. Both of these services support HDR10 and Dolby Vision.

When it comes to most other content, like 4K Blu-rays, the industry prefers Dolby Vision. Dolby has a long association with premium content, especially Hollywood films, so almost any 4K HDR movie you can find on Blu-ray is likely to be encoded in Dolby Vision.

HDR for Gaming

When it comes to gaming, HDR support is all over the place. On PCs, there is yet another standard alongside HDR10 and Dolby Vision: FreeSync 2 HDR, AMD's proprietary standard, designed only for its own GPUs and supported in only a handful of games.

Among consoles, the PlayStation 4 Pro, Xbox One S, and Xbox One X all support HDR. The PS4 Pro supports only HDR10, as did both Xbox One models at launch. Microsoft added Dolby Vision support to its consoles in October 2018, but the rollout has been painful for many users and it works with only a few TVs.

If you're in the market for a 4K HDR TV for gaming, you might need to do a little extra research.

HDR as a Marketing Gimmick

While HDR is an improvement, and ideally a brilliant addition to 4K UHD TVs, it's hard to determine which TV to go for, and not only because of the competing standards. The real challenge comes from manufacturers, who tend to go a little overboard with the HDR tag to attract more buyers.

An ideal HDR TV needs a 12-bit panel, or at least a 10-bit one. This is because HDR also aims to improve the vibrancy of content by displaying more colors.

An 8-bit panel technically shouldn't even be sold as HDR, but since there is no set standard, manufacturers are doing it anyway. The bait is that these TVs are quite cheap, in comparison to full HDR TVs.

8-bit TVs can accept an HDR signal, but the result can look really bad. These TVs lean on a combination of other technologies and claim to hit the required contrast levels, but there is no way to ensure that a TV with an HDR label will actually display HDR content well.
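To see why panel bit depth matters so much for HDR's extra color, a quick back-of-the-envelope calculation helps. The Python snippet below is just arithmetic (shades per channel and the total combinations across red, green, and blue), not a measurement of any real panel.

```python
# Shades per channel = 2 ** bits; total colors = shades ** 3 (one channel each for R, G, B).
for bits in (8, 10, 12):
    shades = 2 ** bits
    colors = shades ** 3
    print(f"{bits}-bit panel: {shades:>4} shades per channel, about {colors:,} colors")

# 8-bit : 256 shades per channel, ~16.8 million colors
# 10-bit: 1024 shades per channel, ~1.07 billion colors
# 12-bit: 4096 shades per channel, ~68.7 billion colors
```

That jump from millions to billions of colors is what lets a true 10-bit or 12-bit panel show HDR's wider range without visible banding.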

Meanwhile, some full HDR TVs with 10-bit panels simply don't get bright enough to display HDR content well.

Additionally, 4K HDR content takes up more space and bandwidth, so if you want an HDR TV to stream Netflix, you will need a decent internet connection. HDR content streamed at SD resolutions looks worse than non-HDR content does, especially on the cheaper TVs.

If you buy a cheap HDR TV and have a spotty internet connection, you are bound to have a bad streaming experience. Some of the cheaper TVs will not even let you turn off HDR, which is not ideal.

Related: How to Start Watching HDR Content on Windows 10

So, Do You Really Need HDR?

No. Not yet, at least.

While you can evaluate your use cases and do a significant amount of research to snag a decent 4K HDR TV, the technology still seems underdeveloped. Of course, there are TVs that support multiple standards of HDR and are seemingly future-proof.

However, since the standards are still prone to revision, the extra investment needed to get the best 4K HDR TV might not be worth it. Furthermore, the content itself is not quite there yet.

All things considered, it would be wise to hold off on the purchase or go for a non-HDR 4K TV, especially if you're on a tight budget. We've also looked at the differences between 4K and other resolutions if you're interested in more.