Complete Guide: What's the Difference Between HDR Types?

Bruno Martinez
In our definitive guide on the subject, discover the difference between the existing types of HDR and choose the one that best fits your content.

Gone are the days when resolution was the most important attribute of a screen: whether on smartphones, laptops, tablets or TVs, what determines the quality of a display nowadays is its ability to reproduce colors, and that is where the HDR standard in use (HDR, HDR10, HDR10+, HLG, Dolby Vision or Advanced HDR) becomes decisive. But what is the difference between the existing HDR types, and which one is the best?

Year after year, the top smartphones released boast screens with the latest HDR technology. In 2020, the iPhone 12 line took a step forward and became the first capable of reproducing, recording and editing content in Dolby Vision, a specific type of HDR. To understand the importance of this technology for the photos we take and the movies we watch on our phones (and not just on them), we need to understand what HDR is and how each standard on the market differs. Shall we?

What is HDR?

Good HDR (right) allows you to differentiate the subtle variations of contrast in a scene (Image: Reproduction/AWS)

An acronym for High Dynamic Range, HDR, in its simplest form, consists of a series of techniques used in capturing and displaying images. These techniques aim to make cameras and screens capable of recording and displaying content with an ever greater contrast variation. When we see a deep black next to a very dark gray, it is the contrast variation, that is, the difference in light and shadow between the two, that allows us to tell them apart.

The function of HDR is to make this variation ever more noticeable. To do so, it makes cameras and screens more sensitive to differences in contrast, allowing them to capture and reproduce a greater number of tones between absolute black and absolute white.

In a nutshell, the goal of HDR is to make the colors and contrast of a digital image as accurate as those of the real scene, seen without the intermediation of screens or cameras. That alone is a huge challenge, as our eyes evolved over millions of years into what they are today.

How does HDR work on cameras?

In cameras, HDR arises from superimposing multiple images with different exposures (Image: Farbspiel Photography)

As we saw above, HDR must be present in both cameras and screens. Because of that, there is no point in recording content in HDR and displaying it on an SDR screen (Standard Dynamic Range). Likewise, your HDR TV will not be able to convert SDR content to HDR. While such a conversion is possible, as some French researchers showed in 2017, it requires processing the content with artificial intelligence, and it still can't be done by our TVs or smartphones.

Understanding that HDR, in order to work, must be present both in the camera that captures the content and in the screen that displays it, it becomes easier to understand how it works in each of the two. In cameras, the simplest HDR technique is to take three pictures in quick succession: one with high light sensitivity, one with medium and one with low light sensitivity. Post-processing software, which varies from one camera model to another, then overlaps the three images and creates a single photo.

In the photo generated from this superimposition, the variation between the lightest and darkest points is greater: the camera's original dynamic range has, in effect, been enlarged. To understand how HDR applies to video, just imagine the camera doing this job for every frame. This is why many smartphone cameras, for example, cannot record at high resolutions with HDR enabled: it ends up being too much information for their sensors and processors to handle.
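If you like code, here is a minimal sketch of that superimposition using the exposure-fusion algorithm available in the OpenCV library. The file names are hypothetical and real camera pipelines are far more elaborate, but the principle of blending bracketed shots into one photo is the same:

```python
import cv2
import numpy as np

# Three bracketed shots of the same scene (hypothetical file names):
# underexposed, normally exposed and overexposed.
img_list = [cv2.imread(name) for name in
            ("shot_dark.jpg", "shot_normal.jpg", "shot_bright.jpg")]

# Mertens exposure fusion blends the best-exposed regions of each shot,
# widening the tonal range of the final photo without camera calibration data.
merge_mertens = cv2.createMergeMertens()
fusion = merge_mertens.process(img_list)

# The result is a float image roughly in the 0-1 range; convert back to 8-bit.
cv2.imwrite("fused.jpg", np.clip(fusion * 255, 0, 255).astype("uint8"))
```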

Although it may seem to affect only the light and shadow of an image, the increase in contrast also intensifies its colors. Look at the photo (Image: Wikipedia)

As stated above, the result of the overlay will depend on the post-processing software used by the camera. If it doesn't do a good job, the image will come out strange, with a wide contrast variation that doesn't correspond to reality. Also, HDR should not simply produce stronger colors: the image should be more vivid not because it is more saturated, but because the difference between the tones of each color is more noticeable and, consequently, more faithful to reality.

When we talk about contrast, it is common to think only of black and white. However, contrast also affects colors, since every color is composed of light and shadow, and the proportion between the two produces a different tone of that color. In both cameras and screens, HDR seeks to make the differences in tone within each color more noticeable, even when they are subtle. HDR should never make the colors gaudy.

As it advances, HDR technology mixes with artificial intelligence, seeking even more precision (Image: Reproduction / Apple)

Once you understand that, let's get back to overlaying images: how the images are overlaid depends on the camera manufacturer. That's why we see so much difference between the HDR of a Google Pixel and that of an iPhone, for example. Currently, most manufacturers use artificial intelligence to improve how the images are combined, as well as to retouch them after the overlay. This is part of so-called computational photography, which seeks to circumvent the technical limitations of smartphone cameras with a dose of intelligence.

In the way it works, HDR is like a cake: each manufacturer follows its own recipe and, as with cakes, it is worth understanding the difference between the types of HDR to choose the one you like the most. Google, for example, improves the HDR of its Pixel smartphones using advanced AI techniques, superimposing not just 3 images but more than 10. The number of images to be superimposed is also not fixed; it depends on how many the AI deems necessary.

This depends, among other factors, on the light conditions and even on the movement of the camera: if you shake your arm a lot, the AI understands that it cannot take as many photos, or everything will come out blurred. It is a complex calculation, isn't it? On top of that, this fragmentation does not help the consumer understand the difference between the types of HDR.

And how does HDR work on TVs?

The difference from SDR to HDR is the amount of information to be displayed on the screen, resulting in richer images (Image: Dolby Labs)

Now that we know how HDR works in cameras, and that it varies according to how each manufacturer decides to use it, we are closer to answering our main question: what is the difference between the existing types of HDR? Especially when we talk about TVs, there are plenty of acronyms (HDR, HDR10, HDR10+, HLG, Dolby Vision, HDR Advanced and others) to confuse us. How do they differ, and which of these 'HDRs' could be considered the best?

Well, once you understand how HDR works in cameras, it is easy to conclude that one consequence of overlapping images is an increase in the amount of information (carried as metadata) present in the photo or video. Basically, HDR 'enlightens' the TV, pardon the pun, about which are the brightest and darkest points in the picture, as well as detailing the variation between them. Consequently, the screen will only do a good job if it can represent all this additional information accurately.

In photometry, 1 nit is equivalent to 1 cd/m², that is, the light emitted by one candle over an area of 1 m² (Image: Reproduction/Metrologia)

In that sense, a common HDR screen is one that, thanks to its technical specifications, is able to display a wide spectrum of light and dark tones, both in black and white and in color. For measurement purposes, this variation is expressed in nits, an unofficial unit (it is not part of the International System of Units) that is widely used in the display industry and is equivalent to 1 candela per square meter (1 cd/m²). 'Nit' is used instead of '1 cd/m²' because it is clearly (haha) a more marketable name.

Basically, the more nits a TV has, the more brightness it can emit. In general, a greater number of nits also means greater contrast capability, since increasing the maximum amount of light emitted widens the spectrum of tones between total brightness and total darkness. In short, the higher its maximum brightness, the better the TV becomes at differentiating intermediate tones, achieving the accuracy needed to display HDR content.

Nits are also used to measure the incidence of light in real environments, such as the one in the photo (Image: Reproduction/Dock10)

In January 2016, an alliance of several screen manufacturers, called the UHD Alliance, established two sets of requirements for a TV to be able to claim HDR technology: in the first, brightness must range between 1,000 and 0.05 nits; in the second, between 540 and 0.0005 nits. With this, every TV that meets one of these metrics can bear the 'basic' HDR label, even if that does not translate into an improvement in the final image.
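As a back-of-the-envelope illustration (our own arithmetic, not part of the official specification text), dividing the peak brightness by the black level of each tier shows why both count as HDR: the first relies on very high brightness, the second on extremely deep blacks:

```python
# Peak brightness and black level (both in nits) of the two tiers above.
tiers = {
    "tier 1 (high peak brightness)": (1000, 0.05),
    "tier 2 (very deep blacks)": (540, 0.0005),
}

for name, (peak, black) in tiers.items():
    # Contrast ratio = brightest level a panel can show / darkest level.
    print(f"{name}: contrast of about {peak / black:,.0f}:1")

# tier 1 -> about 20,000:1
# tier 2 -> about 1,080,000:1
```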

As you can imagine, the lack of broader standardization made HDR on TVs a feature whose real advantages were not always present. All a manufacturer needed was a TV with brightness above 1,000 nits to call it HDR and compete with screens that also added other technologies to be the best on the market, right? Well, if the different HDR standards that cause us so much confusion had not been invented, this would have happened much more often.

The different HDR standards correspond to the different technologies added on top of it, which provide even better images (Image: Reproduction/Internet)

Besides confusing you, each HDR standard specifies other technologies that must be present in a TV that wants to bear its name. With this, manufacturers try to prevent HDR on TVs from becoming the mess that HDR on cameras has become: they want to ensure that, while each manufacturer can improve its screens with its own technologies, the HDR standard used establishes a minimum quality for the overall picture, not just the contrast.

Therefore, the different HDR standards are a certification, a promise that, by carrying that seal (HDR10, HLG, Dolby…), the TV passed rigorous laboratory tests and met several pre-established requirements. These requirements, which vary by standard, are not limited to maximum or minimum brightness values; rather, they aim to improve the perception of color and light in the TV's images.

So what's the difference between the types of HDR?

Now that you have almost become an HDR expert, you will finally understand the difference between the types of HDR out there. Since we have already explained how the technology works in its simplest form, we will present each of its 'evolutions', going from the simplest to the most advanced and explaining their advantages and disadvantages.

HLG

Similar to regular HDR, HLG is designed for live broadcasts (Image: Reproduction/FlatPanelsHD)

Conceived by broadcasters BBC and NHK, the HLG (Hybrid Log-Gamma) format serves these two companies' interest in providing the best possible image, considering that a TV broadcast reaches viewers with both HDR and SDR screens. Therefore, HLG is not limited to TVs with high dynamic range and allows live content to be transmitted to both audiences (with and without HDR TVs). This matters because, in certain cases, displaying HDR content on an SDR TV can make it look worse, with washed out and even distorted colors.

The great advantage of HLG is that it is an HDR format that is backward compatible with SDR TVs. In broadcast TV this is essential; however, the standard is not very popular outside the UK and Japan, the countries where it was created. In Brazil, free-to-air TV broadcasting is entirely SDR. With streaming content, platforms such as Netflix and YouTube can identify which technologies the TV supports, ensuring that the content is 100% compatible with it and eliminating the risk of a degraded image.
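For the curious, this backward compatibility comes from the transfer curve that gives Hybrid Log-Gamma its name, defined in the ITU-R BT.2100 recommendation: the lower part of the signal follows a square-root curve close to what SDR gamma already expects, while highlights are compressed logarithmically. A minimal sketch of that curve:

```python
import math

# HLG OETF from ITU-R BT.2100: maps scene light E (0..1) to a signal value (0..1).
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    if e <= 1 / 12:
        # Dark-to-mid tones: a square-root curve, close to conventional SDR gamma,
        # which is why an SDR TV can show an HLG signal without extra metadata.
        return math.sqrt(3 * e)
    # Highlights: logarithmic compression reserves the top of the signal for HDR peaks.
    return A * math.log(12 * e - B) + C

for e in (0.0, 0.01, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {e:.4f} -> signal {hlg_oetf(e):.3f}")
```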

Because of this, it is very unlikely that HLG will ever become popular, as it is an intermediate solution designed for broadcast TV, which usually takes years to adopt new technologies. Most likely, free-to-air TV will only broadcast in HDR when the majority of the TVs in its audience support the feature.

HDR10 and HDR10+

https://youtu.be/HWjvpMW6tZ0

The fruit of a partnership between Sony and Samsung, HDR10 combines the high contrast capability of HDR with 10-bit color depth. This color depth is considerably greater than the standard 8 bits, meaning that an HDR10 screen, compared with an 'ordinary' HDR screen, is capable of displaying colors that the latter cannot.
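The jump from 8 to 10 bits sounds small, but a quick count of the shades each depth can encode, per channel and combined across red, green and blue, shows how much larger the palette becomes (the 12-bit row anticipates Dolby Vision, discussed further below):

```python
# Shades per primary color (red, green or blue) and total RGB combinations
# for each bit depth.
for bits in (8, 10, 12):
    shades = 2 ** bits
    colors = shades ** 3
    print(f"{bits}-bit: {shades:,} shades per channel -> {colors:,} colors")

# 8-bit : 256 shades   -> ~16.8 million colors (standard SDR)
# 10-bit: 1,024 shades -> ~1.07 billion colors (HDR10 / HDR10+)
# 12-bit: 4,096 shades -> ~68.7 billion colors (Dolby Vision)
```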

HDR10+, in turn, introduced by Samsung and Panasonic, enhances HDR10. Its maximum brightness capacity goes up from 1,000 to 4,000 nits, and the metadata embedded in the image becomes dynamic. Remember when we said that HDR consists of inserting more information (metadata) into images? Well, this metadata tells the TV how to display the content, indicating the maximum and minimum brightness to be represented on the screen.

When this metadata is static, as in HDR10, the maximum and minimum values are the same for every frame, even if the content is a movie that lasts hours. As a result, accuracy in reproduction suffers, since the metadata is generated based on the entire movie, which may contain both darker and brighter scenes.

In the case of HDR10+, whose metadata is dynamic, these values change according to each scene displayed. Each frame has its own metadata, generated using a technique called 'tone mapping'. As the name suggests, this technology maps the frames, checking the darkest and brightest parts of each one and generating the metadata accordingly. As a result, HDR10+ keeps saturation and brightness levels always suited to the displayed image, getting closer to the quality of its rival, Dolby Vision.
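A toy example helps to visualize the difference: with static metadata, a single brightness ceiling is computed for the whole title, while dynamic metadata recalculates it scene by scene. The numbers below are invented purely for illustration; real HDR10+ metadata is produced by professional mastering tools:

```python
# Peak brightness (in nits) of each frame, grouped by scene, for an
# imaginary movie with a dark cave scene and a bright daylight scene.
scenes = {
    "cave":     [80, 95, 120, 110],
    "daylight": [2300, 2500, 2450],
}

all_frames = [nits for frames in scenes.values() for nits in frames]

# Static metadata (HDR10): one ceiling for the entire title, so the cave
# is tone-mapped as if it could reach the daylight peak.
print(f"static : every scene mapped against {max(all_frames)} nits")

# Dynamic metadata (HDR10+): one ceiling per scene, preserving detail
# in dark scenes and highlights in bright ones.
for name, frames in scenes.items():
    print(f"dynamic: scene '{name}' mapped against {max(frames)} nits")
```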

HDR Advanced

With the promise of being a 3-in-1, HDR Advanced is the least popular of the standards out there today (Image: Technicolor)

The least-known HDR standard is developed by Technicolor, a company that has worked with photographic coloring techniques since the film-roll era. Although it carries a single name, HDR Advanced encompasses three HDR sub-standards, each with its own purpose:

  • The first of them, SL-HDR1, stands out for being a 'basic' HDR that is 100% compatible with SDR displays.
  • The second, called SL-HDR2, closely resembles the HDR10+ and Dolby Vision standards, as it also has dynamic metadata.
  • The third and final one, still in the testing phase, tries to unite the two. If it works, it will be the first HDR format with dynamic metadata to be backward compatible with SDR TVs.

The goal is not to give an SDR TV HDR10+ quality, but to avoid the distortions we mentioned earlier. Even so, many people question the usefulness of combining two antagonistic technologies (HDR and SDR). In streaming, it is already possible to deliver HDR content only to compatible TVs, and free-to-air broadcasters will very likely only start producing HDR content when most of their audience has TVs of this type.

For now, HDR is simply not attractive to TV broadcasters, as producing content with the technology is expensive and most people around the world still have SDR TVs. Precisely for this reason, many believe that HDR Advanced is doomed to fail along with its sibling, HLG.

Dolby Vision

Seen as the 'gold standard' among HDR formats, Dolby Vision is present in the most expensive TVs and can be considered the best of those currently available. As usual, such quality comes from the high bar set by Dolby Labs: while two HDR10+ screens can display slightly different images depending on their specifications, projectors and TVs certified with Dolby Vision undergo tests to ensure that, despite each device's peculiarities, the images are displayed exactly according to Dolby's standards.

In addition to all this testing, Dolby Vision has more sophisticated technical requirements. The maximum supported brightness, for example, is 12,000 nits. Although no TV today comes close to this level (the most expensive ones peak at around 2,000 nits), this shows how the standard was also designed with the future in mind. Dolby Vision was also the first standard to feature tone mapping and dynamic metadata, which we explained in the HDR10+ section.

As if that weren't enough, Dolby's technology also offers 12-bit color, meaning that screens with this certification can display colors that even HDR10+ TVs cannot. The downside of this technology is the price: since the format belongs to Dolby, the company charges royalties from the companies that use it, which makes any compatible device more expensive. This factor also makes movies and series produced with the technology more expensive.

While Netflix already has some titles in Dolby Vision and HDR10, only Amazon Prime Video has titles in HDR10+. Also, when playing Blu-ray content, the player device must itself be certified by Dolby, which is not the case for most players of this type of media (consoles). In Brazil, only LG holds the rights to use Dolby Vision technology in TVs, while HDR10+ is restricted to TVs from Samsung and Panasonic, which developed it to compete with Dolby.

* Dolby Atmos: often bundled with Dolby Vision, Dolby Atmos is a sound standard certified by Dolby Labs, commonly present in TVs and smartphones of various brands. As it deals with audio rather than video, Atmos is outside the scope of this article, but we mention it here so you don't get confused when you see its name around. So, do you now understand the difference between the types of HDR?

With information from: CNET, AVForums, JMGO, Digital Reliance, Samsung, HowToGeek

