What is HDR and is HDR worth it?

The simplest explanation of what HDR is and whether it is worth the trouble at this time.

In this video and article I’ll explain HDR as simply as I can, so you can separate the truth from the marketing speak.


Is HDR simple to understand?

Once we move past the basic understanding stage, we will have to dig deeper to find out if HDR is worth the trouble at this point in time. To really understand and get involved with HDR, you need to have a solid grasp of concepts like luminance and nits, Rec. 709, Rec. 2020, stops of light, ACES, waveforms and so on. I’ve already written extensive articles on these subjects, so please read them first to get up to speed.

The simple truth is you can’t master HDR without knowing these things.

What is HDR?

HDR stands for high dynamic range. What we’re currently used to seeing on TV is called SDR or standard dynamic range.

The difference between SDR and HDR is the details in the shadows and details in the highlights. Here’s a simulated shot of a scene that explains the difference:

Nits and Stops

A monitor outputs light, and this is measured in nits. The more nits, the brighter the monitor.

Nits are not the same as the brightness setting you see on monitors; nits are a linear scale of light output. You can also measure light logarithmically, in stops. In cinematography and filmmaking it’s more practical to think of light in terms of stops, and then relate it to nits for reference.

So why is this important? Imagine you’re shooting the sun. A monitor can only give you the perception of looking out a window if the brightness of the sun in it is equal to the brightness of the real sun.

This is not easy, and is not something we should expect from displays anytime in the near future. The best HDR can hope for is to come close enough to give us the same emotional response. So if replicating the sun is a far-fetched dream with today’s technology, then what exactly can we match?

The standard we’ve all been used to for high-definition TV is HDTV, or Rec. 709. According to SMPTE RP166/167 (Critical Viewing Conditions for Evaluation of Color Television Pictures) the peak luminance of a monitor fully supporting it should reach 120 nits (or 35 foot lamberts).

What about stops? How much dynamic range can the Rec. 709 standard really hold? It can hold about 7 stops. One would expect HDR to display more than 7 stops.

Therefore, in theory, anything more than 7 stops is HDR. For simplicity’s sake, let’s assume 7 stops is equal to 120 nits. Every time you want to increase a stop, you have to double the nits. For example, for 8 stops, you need 240 nits. For 9 stops you need 480 nits and for 10 stops you need 960 nits.
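To make that doubling concrete, here’s a minimal Python sketch. It assumes the simplification above, that 7 stops corresponds to a 120-nit peak:

```python
BASELINE_STOPS = 7    # the article's simplification for Rec. 709
BASELINE_NITS = 120   # peak luminance assumed for those 7 stops

def nits_for_stops(stops: float) -> float:
    """Each additional stop doubles the required peak luminance."""
    return BASELINE_NITS * 2 ** (stops - BASELINE_STOPS)

for s in (7, 8, 9, 10):
    print(f"{s} stops -> {nits_for_stops(s):.0f} nits")
# 7 stops -> 120 nits, 8 -> 240, 9 -> 480, 10 -> 960
```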

However, we must not assume that black is zero nits. There are OLED displays that can claim to be near zero nits, but even that is not true black. True black is another thing entirely. If HDTV is 7 stops, where the brightness peaks at 120 nits, the black level should be roughly 0.5 nits.

It’s easy for these numbers to be overwhelming without some context, so let me give you some perspective.

Are HDR monitors better than high-end standard monitors?

Your standard computer monitors can easily do about 120 nits, and high-end monitors can double that. An iPad Mini 4 outputs about 450 nits with black levels at 0.5 nits. An iMac gives out roughly 300-400 nits, with black levels at 0.4 nits. So you can see even consumer displays are already HDR by an easy two stops over Rec. 709.

Moving on to cameras. An old Canon 550D could record more than 10 stops in video mode. A Sony a7S II can do between 13 and 14 stops in log mode. An Arri Alexa can go over 14 stops. From the cheapest DSLRs to the most expensive cameras, they are all HDR already. So HDR isn’t really a new thing. It’s just old wine in a new bottle.

The bottle or bottleneck here being monitors.

So what’s changed with monitors that suddenly make them more desirable? The manufacturers would like you to believe they are all HDR capable today. Normal monitors can do about 9 stops, so these special HDR monitors should do a little more to earn their price tag, right?

Here’s the first disappointing news. Most high-end HDR monitors for home viewing peak at about 1,000 to 2,000 nits. Wait a minute, how can that be disappointing news?

The sustained peak luminance

When you read the term peak luminance in nits, you have to be on your guard. A monitor has millions of pixels, depending on its resolution, and consumer HDR monitors deliver their advertised peak luminance either for a short duration or over a small area. The most commonly cited figure is the 2% or 10% window: if a monitor advertises 1,000 nits, you only get that number over 2% or 10% (or whatever percentage) of the full screen area. A 25% window might give you a bit less, and a full-screen (100%) window even less. The monitor also cannot sustain the peak luminance for long, so there is yet another variable to consider: the sustained peak luminance over 100% of the screen. This is the most important parameter for judging whether a monitor is really capable of any HDR you can throw at it.

Now, a caveat. Peak brightness over a 2% or 10% window isn’t a bad thing, because many times a bulb or the sun in a shot might take up only that much of the frame. Yet, on the whole, this is not why you would pay upwards of $5,000 for these HDR TVs. These high-end television sets are beyond the purchasing power of most individuals, and yet on average you get only about 500 nits sustained over the full 100% area.

Most HDR monitors in the thousand-dollar range have much lower peak luminances, only somewhat better than the monitors we have today. So where is the promised HDR? It’s there, in small bits and pieces. Let’s talk about the black levels too, because they have improved as well.

Black levels

A typical HDR monitor has a black level of 0.1 nits, which is much better than standard monitors. A high-end HDR monitor can go up to 0.01 nits or even better.

So let’s say you get 0.05 nits of black and 1,000 nits of peak luminance. That’s a total of 14 stops if you pay $5,000 and up. On mid-range HDR monitors, you get 0.1 nits of black and 500 nits of peak luminance, which is 12 stops of dynamic range.
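If you want to check these figures yourself, the stop count is just the base-2 logarithm of the contrast ratio between peak white and black level; a quick sketch:

```python
import math

def dynamic_range_stops(black_nits: float, peak_nits: float) -> float:
    """Dynamic range in stops = log2(peak / black)."""
    return math.log2(peak_nits / black_nits)

print(round(dynamic_range_stops(0.05, 1000), 1))  # ~14.3 stops: high-end HDR monitor
print(round(dynamic_range_stops(0.1, 500), 1))    # ~12.3 stops: mid-range HDR monitor
print(round(dynamic_range_stops(0.5, 120), 1))    # ~7.9 stops: Rec. 709-class display
```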

So the natural conclusion to be drawn is that it’s not bad, is it? You’re still getting 12 stops on average, right?

Well, this is where we talk about the ugly shadow problem.

Rec. 2100

The ambient light hitting the monitor really decides how much you can see into the shadows. To understand this let me introduce you to Rec. 2100, the base HDR standard.

It defines a peak luminance of 1,000 nits or greater. The black level should be 0.005 nits or less.

Doing the math, the number of stops between 0.005 nits and 1,000 nits is about 17.5 stops. That sounds awesome, except most of the HDR advantage is in the shadow region rather than the highlights. The increase over Rec. 709 as far as highlights are concerned is only about 3 stops. The remaining advantage, over 7.5 stops, is all in the shadows.
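Here’s that arithmetic as a quick sketch, following the article’s own reference points (7 stops and a 120-nit white for Rec. 709):

```python
import math

def stops(lo_nits: float, hi_nits: float) -> float:
    return math.log2(hi_nits / lo_nits)

total = stops(0.005, 1000)          # Rec. 2100 range: ~17.6 stops
rec709_stops = 7                    # the article's figure for Rec. 709
highlight_gain = stops(120, 1000)   # ~3.1 stops gained above the Rec. 709 white level
shadow_gain = total - rec709_stops - highlight_gain  # ~7.5 stops, all in the shadows

print(f"total {total:.1f} stops: +{highlight_gain:.1f} in highlights, +{shadow_gain:.1f} in shadows")
```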

This is the standard most high-end HDR panels are trying to reach. The standard also specifies what the ambient levels should be so we can take advantage of this awesome shadow detail. According to Rec. 2100, the surrounding light should be about 5 nits to actually see these details in the shadows.

The problem of the shadows

To put things into perspective, if you want to roughly convert nits to lux, as I’ve written about earlier, just multiply by Pi, or 3.14; 5 nits is about 16 lux. This is just a convenient translation for reference, since the two measure different things (nits measure luminance, lux measures illuminance). Netflix recommends an ambient level of only 10 lux. A living room with only romantic lights turned on gives about 50 to 100 lux. A workspace should ideally be more than 500 lux, and a typical studio setting is 1,000 lux. Outdoors it’s a whole lot more. So, the only way to really see anything in the shadows is to view it in a completely dark environment.
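The π rule of thumb is trivial to encode; a minimal sketch (again, this is only a rough reference conversion, valid for an ideal diffuse surface):

```python
import math

def nits_to_lux(nits: float) -> float:
    """Rough rule of thumb: lux ≈ nits × π (ideal diffuse reflector)."""
    return nits * math.pi

def lux_to_nits(lux: float) -> float:
    return lux / math.pi

print(f"{nits_to_lux(5):.0f} lux")    # ~16 lux: the Rec. 2100 surround level
print(f"{lux_to_nits(50):.1f} nits")  # ~15.9 nits: a dimly lit living room
```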

Think about the one lamp you have in your living room, or the small amount of window light that creeps into your room even when you have the curtains drawn. This small light still delivers about 50 lux, which is the rough equivalent of 15 nits.

Or let’s say you manage to completely eliminate ambient light, but your television or monitor is 10 feet away from the wall behind you, which is painted white or beige. Remember, your monitor is itself an LED light source, and a typical mid-range 40” standard monitor, which outputs only about 100 nits, gets back about 5 lux bounced from that wall. An HDR TV that outputs 1,000 nits is about 3 stops brighter, so it would give back about 40 lux, and one that outputs 500 nits would give us about 20 lux. Both of these are above the Rec. 2100 limit of 16 lux. This means you’ll be obliterating important shadow detail just by having a wall 10 feet away.

Either way, let’s assume you can get it down to 50 lux under normal viewing conditions. That’s about 15 nits. The difference from 0.005 nits to 15 nits is more than 11 stops, so whatever 7.5-stop advantage you get in the shadows is swamped; you will not be able to see the difference compared to a normal monitor. The same problem affects standard dynamic range monitors as well, but their black levels are much higher to begin with.
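A quick sanity check of how much of the HDR shadow range that ambient wash swallows, using the same assumptions as above:

```python
import math

ambient_lux = 50                      # a single lamp, or light leaking past the curtains
wash_nits = ambient_lux / math.pi     # ~15.9 nits of veil over the screen
spec_black_nits = 0.005               # Rec. 2100 black level

masked = math.log2(wash_nits / spec_black_nits)
print(f"~{masked:.1f} stops of shadow range lost to ambient light")  # ~11.6 stops
```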

If black levels are manipulated at the mastering stage to accommodate this ambient-light problem, you introduce two other problems. Lower the levels further and you get crushed blacks under brighter ambient light. Raise the black levels and, under lower ambient light, the image looks washed out instead of a deep, rich black, similar to what you see on monitors exposed to a lot of ambient light.

(Simulated examples: normal blacks, crushed blacks, and washed-out blacks.)

You cannot control ambient light, so the best colorists can do is to master the content according to the standard, and hope the consumer is living in a studio-like environment. Yeah, right!

How HDR handles too many technologies, products and expectations

Each TV manufacturer puts their own secret sauce in the TV, and they produce multiple monitors every year, some with different technologies as well. This means you need a program or algorithm that will convert an HDR signal mastered in one standard and map that to the specific monitor on hand. This algorithm is, in simple terms, called the transfer function.

To the layperson, it’s a math formula. This is more complicated than it sounds, because not only does the transfer function have to cater to HDR displays, but it also needs to figure out how the HDR signal must behave if the consumer decides to play it back on a standard monitor.

The most important transfer function we need to know of is the EOTF, or Electro-Optical Transfer Function. Its job is to convert the digital HDR signal into a visible image we can enjoy. It has to maintain the artistic integrity of the original video, and it has to avoid introducing artifacts like banding and noise when it plays around with luma levels and color. This is why HDR signals need more space (mathematical headroom, that is) to allow the EOTF to do its thing. For this reason, you’ll find that HDR signals have to be 10-bit or more. 8-bit signals are more than enough for SDR imagery, but when you need to perform these image manipulations in HDR, you need 10 bits or more.
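As a rough illustration of why the extra bits matter, here’s a simplified sketch that just counts code values per stop, ignoring how a real EOTF distributes them non-linearly:

```python
def code_values_per_stop(bits: int, stops: float) -> float:
    """Simplification: total code values spread evenly (in log terms) across the dynamic range."""
    return (2 ** bits) / stops

print(round(code_values_per_stop(8, 7)))      # ~37 codes per stop: 8-bit SDR is comfortable
print(round(code_values_per_stop(8, 17.5)))   # ~15 codes per stop: 8-bit HDR would band badly
print(round(code_values_per_stop(10, 17.5)))  # ~59 codes per stop: 10-bit HDR has headroom
```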

The two most important transfer functions

There are different EOTFs you can use depending on the end result you desire, and there’s no one size fits all solution. The two most widely used today are:

  • PQ, or perceptual quantizer, and
  • HLG, or hybrid log-gamma.

Both are part of the Rec. 2100 standard. They attack the same problem in two different ways. PQ is not recommended for real-time broadcast. For live and broadcast applications, HLG is recommended.

PQ assumes the standard reference white is still at about 100 nits, and whatever lies above is reserved for specular highlights and other highlight detail only. This means if you watch it on a low dynamic range monitor, it will just clip the values you wouldn’t be able to see anyway.

On the other hand, HLG doesn’t have a nit value for its white point. It defines it as 50% of peak luminance. HLG is designed to be used for monitors with less than 5,000 nits peak luminance, while PQ assumes an ideal 10,000 nits monitor.

PQ is ratified under a standard called SMPTE ST 2084. Dolby uses PQ in its Dolby Vision standard, but insists on 12 bits when everyone else is satisfied with 10. Those who use PQ with 10 bits form a second group under the Consumer Electronics Association; their format is commonly known as the HDR10 Media Profile (the 10 stands for 10 bits).
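For the curious, the PQ EOTF published in SMPTE ST 2084 is just a formula; here’s a sketch of it in Python (the constants come from the standard, and this ignores any display-side tone mapping):

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal in [0, 1] -> display luminance in nits."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(round(pq_eotf(1.0)))     # 10000 nits: PQ's idealised peak
print(round(pq_eotf(0.5), 1))  # ~92 nits: half signal sits near SDR reference white
```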

The three front runner formats for HDR

So we have three formats currently in the lead:

  • HDR10, which is PQ with 10-bits,
  • Dolby Vision, which is PQ with 12-bits, and
  • HLG, which is 10-bits as well, so you could call it HLG10. HLG is standardized under ARIB STD-B67.

So, as a content producer, you have to wonder: should you be mastering in all three formats? The answer is maybe.

Blu-ray HDR and Netflix support both HDR10 and Dolby Vision, in other words, only PQ. YouTube supports both HLG and PQ. The BBC and NHK stick to HLG for broadcast applications. HDR10 and HLG are open standards, so you don’t pay royalties; with Dolby Vision you do. The other problem with Dolby Vision is that there are no 12-bit consumer televisions or monitors, so right now you cannot see true Dolby Vision. Even the best Dolby reference monitors max out at 5,000 nits, though most people consider 10,000 nits a good target to aim for.

How can ACES help?

One of the promises of ACES is that it gives you a 33-stop playground to incorporate any HDR workflow. The ACES color space also encompasses the entire visible color gamut, so color space issues from Rec. 709 to Rec. 2020 to DCI P3 are not a problem.

ACES gives you out of the box transforms, what they call ODTs or Output Device Transforms for different dynamic ranges like 100, 600, 1,000, 2,000 and 4,000 nits, depending on how you’d like to master HDR. So once you have a graded ACES master, it should be a relatively painless step to create different HDR deliverables from one platform. The color space transformations are also handled by these ODTs.
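As a rough illustration of how that “one master, many deliverables” idea might be organised in a pipeline script, here’s a sketch; the ODT labels and targets below are hypothetical placeholders, not the exact names your ACES version or grading tool will use:

```python
# Hypothetical mapping from deliverable to mastering peak and ODT label;
# real ODT names depend on your ACES version and grading application.
DELIVERABLES = {
    "sdr_rec709":  {"peak_nits": 100,  "odt": "Output - Rec.709 (100 nits)"},
    "hdr_pq_1000": {"peak_nits": 1000, "odt": "Output - Rec.2020 PQ (1000 nits)"},
    "hdr_pq_4000": {"peak_nits": 4000, "odt": "Output - Rec.2020 PQ (4000 nits)"},
}

def pick_deliverable(display_peak_nits: int) -> str:
    """Pick the deliverable whose mastering peak is closest to the target display."""
    return min(DELIVERABLES,
               key=lambda name: abs(DELIVERABLES[name]["peak_nits"] - display_peak_nits))

print(pick_deliverable(600))   # -> "hdr_pq_1000" (closest mastering peak to a 600-nit panel)
```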

The biggest advantage of using ACES at this point in time is we don’t know which way the future will swing with HDR. Better displays will come, and rather than start from scratch it would be a relatively minor affair to create one more deliverable. If a new standard emerges an ODT will likely be available for it.

Is HDR worth it?

Whichever way you slice it, shooting HDR is the easy part; cameras do it already. Grading and mastering HDR demands a calibrated environment, an HDR reference display and a tool like DaVinci Resolve to get it done. This is an additional investment for any professional or small post house. You really have to be ready to handle any client demand if you’re in this business, and the fractured nature of the format wars makes that readiness a practical necessity.

Where HDR fails for me, though, is in the consumer display realm. Don’t get me wrong, HDR is the future and I’m really excited for it. It’s just that the displays being sold today on the promise of HDR fall well short of delivering value for the price they demand. In other words, buying an HDR TV today is not worth it unless you have the disposable income. This is my personal opinion.

And don’t be fooled: a lot of what is touted as HDR is simply an improvement in color gamut, but even standard displays nowadays come close to covering P3 or Adobe RGB, so this is not specific to HDR. The panels are getting better and more power efficient, too, but they will end up consuming the same or more power to deliver the extra brightness.

Lastly, let me address one more issue with HDR – our eyes.

Our eyes are HDR, but not HDR TV-compatible!

Our eyes still work differently – rods and cones and all that. The eye must adjust for night and day, according to different changes in light. No HDR transform can replicate that perception.

Certain peak levels might cause a flickering effect in our eyes, because the retina responds to different frequencies differently in different regions. This is directly related to the size of the display and our distance from it. Also, our eyes adjust to brightness changes more slowly than a cut on screen. Even if a camera can shoot from the darkest dark to the brightest bright, and even if a display can show it, our eyes may not be able to take in all of that visual information in one go, or they might need more time than the shot currently allows. This means you might need to edit differently for HDR material. This problem isn’t very pronounced today because we don’t have true HDR yet, but it will come.

Standard transitions and effects like dissolves, fades, flares, etc., will look different in HDR. That blue anamorphic flare you love so much might end up being an annoying distraction. Let’s wait to see if it does.

To wrap this up, as a filmmaker and cinematographer, shoot as you do, and let others worry about it. You still can’t see what your cameras can shoot. As a post production colorist or mastering facility, can you afford to move to an ACES workflow? Are your clients going to appreciate in monetary terms the additional cost, training and workflow that HDR entails? If no, then it’s not worth it.

As for the consumer, the advantage of HDR monitors over current standard displays is only one or two stops at best, even disregarding the peak luminance window problem. Are you willing to pay an order of magnitude more money to see one or two stops of improvement? That’s your call.

I would say, based on my own calculations, that when you get 10,000 nits on HDR monitors, it’s time to get excited.

I’ve been studying HDR content on HDR TVs in multiple locations for many months now. And to be honest, if you don’t tell somebody it’s HDR, they won’t know. Walk into an electronics store and see if you can figure out which TVs are HDR without reading the labels. Then look at the price. Finally, don’t forget to factor in HDCP as well: without the right graphics card or device, cable and monitor, you won’t even be able to see a lot of 4K HDR content. And the sad part is it’ll all change in just a few years.

I hope you’ve found this article useful. If you have any questions, please feel free to ask me.
