HDR vs SDR – Quick & Easy Explanation
Last updated on May 30, 2023
Let’s talk about one of the most exciting topics in display technology: the differences between SDR and HDR. If you are not familiar with these terms, don’t worry; I will explain them in a simple way.
What is SDR?
SDR stands for Standard Dynamic Range, and it is the most common format for displaying images and videos on screens. It has a limited range of brightness and color values, which means that it cannot show very bright or very dark areas, or very saturated or very subtle colors. SDR is what you see on most TVs, monitors, laptops, smartphones, and other devices.
What is HDR?
HDR stands for High Dynamic Range, and it is a newer format that can display a much wider range of brightness and color values. It can show very bright highlights and very dark shadows, as well as very vivid and very nuanced colors. HDR is what you see on some of the latest TVs, monitors, laptops, smartphones, and other devices that support this technology.
What makes SDR and HDR different?
So why does SDR look different than HDR? The main reason is that SDR and HDR use different color spaces. A color space is a set of rules that define how colors are represented and displayed on a screen. SDR uses a color space called Rec. 709, which was designed in 1990 and covers about 35% of the colors that the human eye can see. HDR uses a color space called Rec. 2020, which was designed in 2012 and covers about 75% of the colors that the human eye can see.
This means that HDR can show many more colors than SDR, with the largest gains in the green and red regions of the spectrum. For example, HDR can show a vivid red fire truck or a deep blue sky that SDR can only approximate. HDR can also show more shades of each color, which makes the image look more realistic and natural.
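To make the gamut difference concrete, here is a minimal Python sketch (my own illustration, not part of any standard’s tooling) that compares the areas of the Rec. 709 and Rec. 2020 color triangles on the CIE 1931 xy chromaticity diagram, using the primaries published in each standard. The coverage percentages quoted above are measured against the full visible gamut, so the raw triangle-area ratio below is only a rough proxy for them.

```python
# Compare the gamut triangles of Rec. 709 and Rec. 2020 in CIE 1931 xy space.
# The (x, y) chromaticity coordinates of the R, G, B primaries come from
# the published specification of each standard.

REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B

def triangle_area(points):
    """Area of a triangle given three (x, y) vertices (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

a709, a2020 = triangle_area(REC_709), triangle_area(REC_2020)
print(f"Rec. 709 triangle area:  {a709:.4f}")   # about 0.112
print(f"Rec. 2020 triangle area: {a2020:.4f}")  # about 0.212
print(f"Rec. 2020 covers roughly {a2020 / a709:.1f}x the area")
```

Running this shows that the Rec. 2020 triangle has roughly 1.9 times the area of the Rec. 709 triangle, which is the extra room HDR has for saturated greens and reds.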
Another reason why SDR looks different than HDR is that SDR and HDR use different gamma curves (transfer functions). A gamma curve is a function that maps the input signal (the image data) to the output signal (the brightness level) on a screen. SDR uses a gamma curve called BT.1886, which was standardized in 2011 and assumes a reference peak brightness of 100 nits (a nit is a unit of luminance). HDR uses a transfer function called PQ (Perceptual Quantizer), standardized in 2014 as SMPTE ST 2084, which encodes absolute brightness levels up to 10,000 nits.
This means that HDR can show much brighter highlights and much darker shadows than SDR, which makes the image look more dynamic and contrasty. For example, HDR can show a blazing sun or a dim cave that SDR would clip to white or crush to black. HDR can also preserve more detail in both bright and dark areas, which makes the image look clearer and sharper.
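The difference between the two transfer functions is easy to see in code. Below is a minimal sketch, written for this explanation rather than taken from any reference implementation: an SDR curve simplified to a pure 2.4-power gamma with a 100-nit peak (a common reduction of BT.1886 with zero black level), next to the PQ EOTF with the constants defined in SMPTE ST 2084.

```python
# Map a normalized signal value in [0, 1] to absolute luminance in nits
# for SDR (simplified BT.1886) and HDR (PQ / SMPTE ST 2084).

def sdr_bt1886_simplified(signal: float, peak_nits: float = 100.0) -> float:
    """BT.1886 reduced to a pure 2.4-power gamma with zero black level."""
    return peak_nits * signal ** 2.4

def hdr_pq_eotf(signal: float) -> float:
    """PQ EOTF per SMPTE ST 2084: maps [0, 1] to [0, 10000] nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for v in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"signal {v:4.2f} -> SDR {sdr_bt1886_simplified(v):7.2f} nits, "
          f"HDR (PQ) {hdr_pq_eotf(v):8.2f} nits")
```

Note how PQ spends about half of its signal range below roughly 100 nits (a signal of 0.5 decodes to about 92 nits): the curve is shaped to match human contrast sensitivity, packing precision into shadows and midtones while still reaching extremely bright highlights.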
Conclusion
As you can see, SDR and HDR are very different formats, each with its own advantages and disadvantages. SDR is more compatible and consistent across different devices, but it has limited brightness and color capabilities. HDR is more immersive and realistic, but it requires special hardware and software support.
I hope you enjoyed this article and learned something new about display technology. The next time someone asks you what the difference is between the two, you should easily be able to share what you learned today. It should also be apparent why HDR content may not look as good as SDR in some cases: HDR aims for more natural, accurate colors, and it has the color range to reproduce them without exaggeration. If you have any questions or comments, please feel free to leave them below. Thank you for reading!
Read more: Does HDR look better than SDR?