Entertainment Technology Explained

The Era of 4K TVs: Do You Really Need HDR?

Palash Volvoikar 26-12-2018

It’s the era of 4K Ultra HD TVs, and a mid-generation technology has stepped in alongside it. We have had 4K UHD for a few years now, and the newest addition to it is called High Dynamic Range, better known as HDR.


While HDR has been around for a while now, there is still some mystery surrounding it. So do you really need it?

What Is HDR?

We have known High Dynamic Range as a feature in cameras, especially because of its heavy use in the marketing of smartphone cameras. Dynamic range refers to the difference between the darkest and brightest area of a scene, and a high dynamic range means this difference is large.

Basically, with HDR, a scene is processed keeping the different lighting and contrast of different areas of the scene in mind. This means the light and dark parts of the scene are processed differently. This leads to a more realistic reproduction of the scene.
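Dynamic range is usually measured in photographic "stops", where each stop represents a doubling of light. A quick sketch of the arithmetic (the luminance figures below are illustrative examples, not measurements of any particular panel):

```python
import math

def dynamic_range_stops(brightest_nits: float, darkest_nits: float) -> float:
    """Dynamic range in photographic stops: each stop is a doubling of light."""
    return math.log2(brightest_nits / darkest_nits)

# Illustrative figures only: a typical SDR panel vs. a bright HDR panel.
sdr = dynamic_range_stops(300, 0.3)    # ~10 stops
hdr = dynamic_range_stops(1000, 0.05)  # ~14.3 stops
print(f"SDR: {sdr:.1f} stops, HDR: {hdr:.1f} stops")
```

The wider the gap between the darkest black and brightest white a panel can show, the more of a real scene's lighting it can reproduce without crushing shadows or blowing out highlights.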

HDR TVs became popular in 2016, when 4K was just starting to be accepted by consumers on a wide scale. Unlike generational technologies like 4K, however, HDR hasn’t become mainstream, and there are a few reasons for that.

The Double Standards: HDR10 and Dolby Vision

The technology industry has fought format wars for ages, and HDR is the latest battleground. There are two prominent standards right now, namely HDR10 and Dolby Vision, and this is the prime reason it might be too early for you to invest in an HDR display.


It’s HD DVD versus Blu-ray, all over again. Whenever two standards compete for one slot in the market, it’s risky to commit to either. HDR10 is the free and open standard, whereas Dolby Vision is Dolby’s proprietary one.

For a deeper look, see our HDR10 versus Dolby Vision comparison.

HDR10 is present in all HDR TVs, while only a few TVs support Dolby Vision. Every TV that supports Dolby Vision also supports HDR10, but not the other way around. For now, both formats have their advantages and disadvantages, but the industry will eventually settle on one.

Additionally, Samsung has its own revision of HDR10, called HDR10+. It improves on HDR10’s flaws and puts the open format in direct competition with Dolby Vision. That evens the contest, which means it might be a while before the industry settles on a single standard.


The Lack of HDR Content

HDR is still a relatively new technology, and content for it hasn’t become mainstream yet. The double-standards problem shows up in HDR content as well.

Ultra HD TVs
Image credit: LG/Flickr

HDR for Video Content

As of now, HDR is being slowly adopted by content providers. The biggest name on the list is Netflix, which makes some of its original titles in 4K HDR. Amazon Prime Video also supports 4K and HDR for some titles. Both of these services support HDR10 and Dolby Vision.

When it comes to physical media, 4K Ultra HD Blu-rays use HDR10 as their baseline, with Dolby Vision available as an optional layer. Dolby has a long association with premium content, especially Hollywood films, so many big-budget 4K HDR releases on Blu-ray carry Dolby Vision on top of HDR10.


HDR for Gaming

When it comes to gaming, HDR support is all over the place. On PC there is yet another standard in addition to HDR10 and Dolby Vision: FreeSync 2 HDR, AMD’s proprietary standard, which is supported in only a few games and designed solely for AMD’s own GPUs.

Among consoles, the PlayStation 4 Pro, Xbox One S, and Xbox One X all support HDR. The PS4 Pro supports only HDR10, as did both Xbox One models at launch. Microsoft added Dolby Vision support to its consoles in October 2018, but the rollout has been painful for many users and it works with only a few TVs.

If you’re in the market for a 4K HDR TV for gaming, you might need to do a little extra research.

HDR as a Marketing Gimmick

While HDR is an improvement and, ideally, a brilliant addition to 4K UHD TVs, it’s hard to decide which TV to go for, and not only because of the competing standards. The real challenge is that manufacturers tend to go overboard with the HDR tag to attract more buyers.


An ideal HDR TV needs at least a 10-bit panel, and preferably a 12-bit one. This is because HDR also aims to improve the vibrancy of content by displaying a wider range of colors.

An 8-bit panel technically shouldn’t even be sold as HDR, but since there is no enforced standard, manufacturers do it anyway. The bait is that these TVs are quite cheap compared to full HDR TVs.
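The difference bit depth makes is easy to quantify: each extra bit doubles the number of shades per color channel, and the total palette is the per-channel count cubed. A rough sketch (real panels also involve processing like dithering, which this ignores):

```python
def shades_per_channel(bits: int) -> int:
    # Each bit doubles the number of distinct brightness steps per channel.
    return 2 ** bits

def total_colors(bits: int) -> int:
    # Red, green, and blue channels combine independently.
    return shades_per_channel(bits) ** 3

for bits in (8, 10, 12):
    print(f"{bits}-bit: {shades_per_channel(bits):>5} shades/channel, "
          f"{total_colors(bits):,} total colors")
```

An 8-bit panel tops out at roughly 16.7 million colors, while a 10-bit panel can show over a billion, which is why visible banding in smooth gradients is a telltale sign of a fake "HDR" set.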

8-bit TVs can accept an HDR signal, but the result can look bad. These sets rely on a combination of other tricks to claim the required contrast levels, and there is no way to ensure that a TV with the HDR label will actually display HDR content well.

Meanwhile, some full HDR TVs with 10-bit panels simply don’t get bright enough to display HDR content well.

Additionally, 4K HDR content takes up more space and bandwidth, so if you want an HDR TV to stream Netflix, you will need a decent internet connection. When a stream drops to lower resolutions or bitrates, HDR content can look worse than non-HDR content does, especially on cheaper TVs.
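To put rough numbers on that bandwidth point: streaming services commonly cite around 25 Mbps for 4K streams versus about 5 Mbps for HD (ballpark figures, not a specification). Converting a bitrate into data used per hour of viewing is simple arithmetic:

```python
def gigabytes_per_hour(megabits_per_second: float) -> float:
    """Convert a streaming bitrate (Mbps) to data consumed per hour (GB)."""
    # 3600 seconds/hour, 8 bits/byte, 1000 MB/GB.
    return megabits_per_second * 3600 / 8 / 1000

# Ballpark bitrates: ~25 Mbps for 4K HDR, ~5 Mbps for plain HD.
print(f"4K HDR: ~{gigabytes_per_hour(25):.1f} GB/hour")  # ~11.2 GB/hour
print(f"HD:     ~{gigabytes_per_hour(5):.1f} GB/hour")   # ~2.2 GB/hour
```

At over 11 GB per hour, a capped or spotty connection will struggle long before picture quality becomes the bottleneck.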

If you buy a cheap HDR TV and have a spotty internet connection, you are bound to have a bad streaming experience. Some of the cheaper TVs will not even let you turn off HDR, which is not ideal.

So, Do You Really Need HDR?

No. Not yet, at least.

While you could evaluate your use cases and do a significant amount of research to snag a decent 4K HDR TV, the technology still feels underdeveloped. Of course, there are TVs that support multiple HDR standards and seem future-proof.

However, since the standards are likely to be revised, the extra investment in the best 4K HDR TV might not be worth it. Furthermore, the content itself is not quite there yet.

All things considered, it would be wise to hold off on the purchase or go for a non-HDR 4K TV, especially if you have a tight budget. We’ve also looked at how 4K compares to other resolutions if you’re interested in more.

Related topics: 4K, Hardware Tips, HDR, Television, Ultra HD.


  1. Rod
    January 2, 2019 at 4:15 pm

    I went ahead and bought a 2016 4K TV because it was the last year that 4K TVs were made with 3D. I really like my 3D, and I realized this was the last chance to retain a 3D TV and also have 4K. My wife is happy, as I told her this would be the last TV for quite a while. The 3D is amazing, as with a 4K TV both 3D images (left and right) are in full 1080p. So for me, getting a 4K TV now was the way to go.

  2. Jason
    December 27, 2018 at 4:20 am

    Last month, I bought a 55" LG C8 (OLED) for the bedroom, along with an Apple TV 4K. With this particular TV, the difference between HDR and SDR is as pronounced as the difference between HD and SD was, back when 1080p first became really prevalent. It's that dramatic. I've actually started rebuying my movie library (which is primarily based in Vudu/Ultraviolet) in the iTunes store, specifically targeting movies that are available in 4K with HDR. I flat-out refuse to buy a movie now that's NOT in HDR. Fortunately, most new releases seem to be coming out in HDR (particularly Dolby Vision) and the more popular/big budget franchises from the past are available that way as well.

    So, while I guess I can concede that this article's conclusion is correct and that you don't really *need* HDR — of course, nobody *needs* it — if you're willing to invest in a TV that can take advantage of it (i.e. OLED, because my-god-those-black-levels), it's a tremendously good viewing experience.

    • James Segrove
      December 27, 2018 at 10:56 am

      I absolutely agree with the comment above. I bought one of the first UHD Premium TVs at the start of 2017, and it was and still is jaw-dropping. I cannot tell you how important this technology is, and if you're going to drop a load of money on a new TV, it must have decent HDR for the full impact that TV has to offer now.