When Nits are Not Enough: Quantifying Brightness in Wide Gamut Displays

Finding Better Ways to Compare Perceived Brightness 

As I covered in my last post, Better Together: Combining HDR and Quantum Dots, manufacturers are eager to transition their products to both high dynamic range (HDR) and wide color gamut (WCG) to achieve the optimum level of color and visual performance.  With the accelerating adoption of WCG display technologies, the challenge of finding a “fair” brightness metric to compare displays is becoming increasingly important; even more so as the HDR trend pushes nit counts ever higher.

At this year’s CES (the largest consumer electronics show in the world), the combination of HDR and WCGs was arguably one of the hottest topics at the show, with nearly every major manufacturer demonstrating HDR/WCG in their next generation displays.

HDR typically requires higher peak luminance levels, which means higher maximum nit counts.  Samsung was touting 1000 nit displays in their booth, while Sony had a 4000 nit technology demonstration.  Obviously, manufacturers want consumers to associate higher nit counts with higher brightness and better image quality.

Whether this is actually the case will depend on the color performance of the display, specifically on its spectral profile.  Why is that?  Two reasons.  First, there is a fundamental tradeoff between nits and color gamut – the wider the gamut the lower the nit level.  Second, wider color gamuts result in higher perceived brightness levels.  With these two effects in play, you can start to imagine why it becomes difficult to get a true “apples-to-apples” comparison on display brightness.

The “Luminance = Brightness” Myth

If you’ve been shopping for a TV or computer monitor lately, you’ve likely come across the word “nits” prominently listed in the display’s technical specifications for brightness.  Some time ago display manufacturers began using nits, which is actually a measure of luminance, as a way to market the brightness levels of their products.

What’s the difference?  In color science, any term ending in “–ness” (e.g. colorfulness, lightness, brightness, etc.) refers to a subjective attribute of light – in other words, properties of light that cannot be directly measured and can only be described as sensations by the observer.

Bright“-ness” is defined as the subjective sensation in the consciousness of a human observer when light falls on the retina.  We use words like “dim” or “very bright” to describe this non-linear response to light.  [Note: Do not confuse this concept with the “brightness” setting on most displays, which controls the black level of the display].

Luminance, on the other hand, is the closest correlate to brightness that can be repeatably measured (in the form of nits, or candelas per square meter).  It represents the amount of light that is emitted or reflected from a surface.

One simple way to understand the difference between brightness and luminance is to consider their logarithmic relationship to one another – the sensitivity of the eye decreases rapidly as the luminance of the source increases (see the graphic to the right).

In other words, if the luminance of a source increases 10X, viewers do not perceive a 10X increase in brightness.  It is this characteristic that allows the human eye to operate over such an extremely wide range of light levels (14 stops for you photographers out there).
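If you want to put a rough number on that compression, one widely used correlate of perceived lightness is the CIE 1976 L* formula (a cube-root law that closely tracks the non-linear behavior described above).  The sketch below, with purely illustrative luminance values, shows how modest the perceived gain from a 10X luminance jump really is:

```python
# Rough illustration: perceived lightness vs. luminance using CIE 1976 L*.
# L* is one standard correlate of perceived brightness (a cube-root law,
# closely related to the compressive response described above).

def cie_lightness(Y, Y_white=100.0):
    """CIE 1976 L* for a luminance Y relative to a reference white Y_white."""
    t = Y / Y_white
    if t > (6 / 29) ** 3:
        f = t ** (1 / 3)
    else:
        f = t / (3 * (6 / 29) ** 2) + 4 / 29
    return 116 * f - 16

# A 10x jump in luminance (e.g. 10 -> 100 nits against a 100 nit white)...
low, high = cie_lightness(10), cie_lightness(100)
print(f"L* at 10 nits:  {low:.1f}")   # ~37.8
print(f"L* at 100 nits: {high:.1f}")  # 100.0
print(f"Perceived increase: ~{high / low:.1f}x, not 10x")
```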

Many professionals in the display community continue to use “brightness” interchangeably with “luminance,” simply because far fewer people know what “luminance” means.  And display manufacturers, eager to give buyers an easy number for comparing how bright displays look, are only cementing the confusion.

Why does this even matter?  Well, it didn’t when all of our displays used the same backlight technology and provided more or less the same level of color performance (Rec. 709 or sRGB).  However, as wide color gamut (WCG) displays proliferate, the picture quickly becomes more complex.

Quantum dots, RG phosphors, OLED, lasers, RGB LEDs and other technologies are becoming more common, and with this diversification in backlight technology, the interchangeability between luminance and brightness starts to break down.  Fundamentally, this all comes down to the spectral profile of the RGB primaries used to make the full color display.

Nit-picking: Understanding Photopic Response Overlap and the H-K Effect

One of the key features that enables WCG displays, and in particular quantum dot displays, to produce a wider range of colors is the narrow emission spectra of their red, green, and blue (RGB) display primaries.

Compared to the phosphor-based approach used in most standard LCDs, quantum dots produce extremely narrow spectral bands of light.  Due to the spectral purity of the emitted wavelengths (typically 20-30 nanometers in quantum dot displays), RGB primaries are perceived as having a much higher level of saturation.

These highly saturated RGB primaries can be plotted on a CIE gamut diagram as the vertices of a triangle, where every color within the triangle can be recreated by the display (see image to the left).

The horseshoe shape represents all of the colors our eyes can perceive, with the edges of the horseshoe representing monochromatic wavelengths of light.  Therefore, the narrower the spectral profile of emission, the more saturated the color of the display primary, and the closer to the edge of the horseshoe the RGB primaries lie.

Narrower primaries therefore produce larger and larger triangles that cover more and more of the perceptual color space.  Notice how even with the widest TV color gamut standard, Rec. 2020, the three primaries only cover about 63% of all perceived colors.  In other words, there is plenty of room to improve our displays from a color performance perspective.
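For a rough sense of how much bigger those triangles get, the sketch below computes the areas of the Rec. 709 and Rec. 2020 triangles in CIE 1931 xy chromaticity space from their standard primary coordinates.  Keep in mind that xy area is only a crude comparison; coverage figures like the 63% above are computed against the full spectral locus, and the industry often prefers other chromaticity spaces for this purpose.

```python
# Compare the sizes of the Rec. 709 and Rec. 2020 gamut triangles in
# CIE 1931 xy chromaticity space (shoelace formula for triangle area).
# xy area is a crude metric, but it shows how much larger Rec. 2020 is.

def triangle_area(primaries):
    """Area of the triangle spanned by three (x, y) chromaticity points."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B

a709, a2020 = triangle_area(REC709), triangle_area(REC2020)
print(f"Rec. 709 area:  {a709:.4f}")   # ~0.112
print(f"Rec. 2020 area: {a2020:.4f}")  # ~0.212
print(f"Rec. 2020 is ~{a2020 / a709:.1f}x larger in xy")  # ~1.9x
```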

Another way to evaluate the color performance of a display is to examine its spectral distribution.  In the diagrams below, the red lines represent the spectral distributions from two different types of displays: WLED vs. quantum dot.  As drawn, the spectral profiles of the two technologies are quite different, with the quantum dot display showing three distinct RGB peaks (thanks to the narrow RGB primaries) versus the blue LED peak and broad yellow emission of the WLED.

[Figure: spectral power distributions of a WLED display and a quantum dot display, overlaid with the photopic response curve]

The WLED display on the left is reproducing 75% of the NTSC gamut.  The WCG display on the right is reproducing 100% of the NTSC gamut. Again, the wider gamut on the right is a product of the spectrally narrow RGB primaries.  The blue lines in the diagram represent something called the photopic response, which is the way our eyes perceive and naturally emphasize certain wavelengths across the visible spectrum.  So, display wavelengths closer to the peak of the photopic response curve (555 nm) contribute more to the calculation of the nit level than wavelengths at the tails.

Because of the broader spectrum of light from the white LEDs, the WLED spectrum overlaps the photopic response peak to a greater degree than that of the quantum dot display, resulting in a higher nit count.  Conversely, the narrower, more well-defined quantum dot primaries, with less photopic response overlap, result in a lower nit count.

This difference in nits due to spectral profile variance happens despite the fact that the power delivered by both displays is the same (i.e. the area under the curves is identical).  Said another way, a quantum dot display with the same power output as a 400 nit WLED display would measure only 336 nits, a drop in luminance of 16%.
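To make the mechanism concrete, here is a toy version of that calculation: two synthetic spectra with identical radiant power, one broad and “WLED-like,” the other built from three narrow “QD-like” peaks, each weighted by a rough Gaussian stand-in for the CIE photopic curve.  The spectra and the V(λ) approximation are illustrative only, not measured data, so the resulting percentage will not match the 16% example above exactly:

```python
import numpy as np

# Toy comparison: same radiant power, different spectral shape, different nits.
# V(lambda) is approximated here by a Gaussian peaking at 555 nm; the real CIE
# photopic curve is tabulated, so treat the output as illustrative only.

wl = np.arange(380.0, 781.0, 1.0)  # wavelength grid in nm

def gaussian(center, fwhm):
    sigma = fwhm / 2.355
    return np.exp(-0.5 * ((wl - center) / sigma) ** 2)

v_lambda = gaussian(555, 100)  # crude stand-in for the photopic response

# "WLED-like": narrow blue LED peak plus a broad yellow phosphor band.
wled = gaussian(450, 20) + 1.5 * gaussian(570, 120)
# "QD-like": three narrow RGB primaries.
qd = gaussian(450, 25) + gaussian(530, 25) + gaussian(630, 25)

# Normalize both spectra to the same total radiant power.
wled /= wled.sum()
qd /= qd.sum()

# Relative luminance ~ integral of the spectrum weighted by the photopic response.
lum_wled = (wled * v_lambda).sum()
lum_qd = (qd * v_lambda).sum()
print(f"QD luminance relative to WLED at equal power: {lum_qd / lum_wled:.0%}")
```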

Of course, the degree of the nit penalty for the WCG display depends entirely on the spectral distributions being compared.  The example above clearly demonstrates the tradeoff between color gamut and luminance as well as the difficulty in using nits to evaluate standard gamut and WCG displays.  If the displays in question have different spectral distributions, then the comparison between the two is no longer “apples-to-apples”.

 

But wait, there’s more …

Even though the quantum dot spectrum above measures a 16% lower nit count relative to the WLED display, the perceived drop in brightness is not as drastic as the logarithmic relationship between brightness and luminance would predict.

Put another way, the more saturated color reproduction causes the WCG display to be perceived as brighter than its measured luminance would suggest.  Why is this the case?  There is a psychovisual phenomenon known as the Helmholtz–Kohlrausch effect (or, simply, the “H-K effect”) whereby saturated colors appear brighter to our eyes than less saturated colors of equivalent luminance.  I’ll delve further into this interesting phenomenon in my next post.

Until then, the important takeaways from this post are:

  • Luminance (nits) is not the same as brightness, especially when comparing standard color gamut and WCG displays
  • WCG displays require narrow RGB primaries, resulting in reduced photopic response overlap (lower nit count)
  • Narrow RGB primaries appear brighter due to the H-K effect

 

All of this makes finding an alternative specification for fair brightness comparisons one of the more important challenges to solve in the design and marketing of WCG display technology.  Without it, display engineers and consumers alike will get left in the dark about the full performance benefits of WCG displays.  What’s needed is a revised photopic response curve that takes into account the effects I’ve outlined in this post.  At QD Vision, we’re already developing this new metric and are looking forward to sharing it with the display industry in the near future.  I’ll be talking about that in a future post, so stay tuned.

 

By John Ho, Product Marketing Manager, QD Vision

Better Together: Combining HDR and Quantum Dots

I recently had the pleasure of attending the Society of Motion Picture and Television Engineers (SMPTE) annual conference, held in downtown Hollywood adjacent to the historic TCL (formerly Grauman’s) Chinese Theater.

I’ve paid close attention to this event for many years, as it always produces valuable content and great thinking, but this year was especially significant. As usual, hundreds of professionals from the film and broadcast industries gathered to discuss research findings, show off the latest technology innovations, and ultimately to push for broader standardization within the industry.

Amidst it all, three important themes emerged:

  1. Ultra High Definition (UHD) cinema and TV are on the cusp of a historic transition to more immersive and realistic content, above and beyond what we’ve seen so far with 4K resolution.
  2. This transition is largely being driven and enabled by two technologies that are usually discussed separately, but that are increasingly interconnected and complementary: High Dynamic Range (HDR) and Wide Color Gamuts (WCG).
  3. Display manufacturers are not waiting for the standards to get sorted out, meaning there will be many flavors of HDR in the marketplace, with the versions that combine WCG (i.e. quantum dots) showing the best overall improvement in visual quality.

Beyond 4K

Since the inception of TV in the 1940s, major cycles of innovation have pretty much looked the same.  The introduction of new technologies drives production of new, more advanced content, which then drives demand for the most advanced TVs and monitors, and so on. The majority of these cycles have focused on increasing screen resolution and size, rather than dramatic improvements in color and visual performance, as illustrated by the timeline below.

Color has tended to take a back seat to resolution and size increases.  For example, when the industry shifted from analog to digital broadcasting in the early 2000s, moving from NTSC to the Rec. 709 standard, color performance actually decreased considerably.

We’re in one of these major innovation cycles now, moving from HD to UHD, but this cycle is shaping up to be quite different from the many that came before.  Not only are we making a significant move up in resolution, from 1080p to 4K, we’re also taking an unprecedented leap forward in color with WCGs and in visual performance with HDR (not to mention other emerging technologies such as high frame rates).

You’ve likely been noticing a lot of buzz about these two technologies lately, and for good reason.

Many of the HDR demonstrations I saw at SMPTE were better than anything I’ve seen commercially available to date.  There was even a 94% BT.2020 monitor using quantum dots showing colors not accessible on even the best OLED displays.

Demo content had dramatically deeper blacks and a color richness that, when combined with 4K resolutions, created an immersive experience that needs to be experienced to be truly appreciated — at times I felt like I was actually watching real-life happening through a window.

But the biggest discussions at SMPTE didn’t involve HDR’s picture quality potential. That’s already widely acknowledged. The real question on everyone’s mind is about finding the best path to get there.

The Luminance Question

At the center of the debate is the necessary level of peak luminance and its impact on the Electro-Optical Transfer Function (EOTF).  There is a great article that goes into more background on EOTF here.

Luminance is measured in candelas per square meter (cd/m²), commonly known as nits, and in nature reflected luminance varies considerably across the different areas within a person’s field of vision.  Some areas in shadow or darkness may have an extremely low luminance of less than one nit, while other areas in full sunlight may have a nit count of 15,000 or more.

Today’s reference monitors are calibrated to a maximum luminance of 100 nits and minimum luminance of 0.117 nits, significantly limiting their ability to accurately reproduce what we see naturally in real life. HDR aims to solve the problem of natural visual reproduction by introducing a luminance range that is more closely aligned with the natural world.  But, in order to increase nit count, brighter LED backlights are needed, which increases the overall power requirements of the system.
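As a quick back-of-the-envelope check, dynamic range is often expressed in photographic stops, i.e. doublings of luminance, computed as log2(peak / black).  Using the reference monitor figures above, and the HLG example quoted later in this post:

```python
from math import log2

# Dynamic range in photographic "stops" = log2(peak luminance / black level).

def stops(peak_nits, black_nits):
    return log2(peak_nits / black_nits)

# Today's SDR reference monitor: 100 nit peak, 0.117 nit black.
print(f"SDR reference: {stops(100, 0.117):.1f} stops")  # ~9.7 stops

# The HLG example discussed below: 2,000 nit peak, 0.01 nit black.
print(f"HLG example:   {stops(2000, 0.01):.1f} stops")  # ~17.6 stops
```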

So, the real question becomes, how can display manufacturers reach the significantly higher nit counts required for HDR, while still meeting stringent energy-efficiency standards such as Energy Star in the U.S. and E.E.I. in China? Answering that question depends not only on how much luminance display manufacturers can squeeze out of the backlight, but also on which HDR standards become adopted moving forward.

Competing HDR Standards

The two most prominent HDR standards both focus on the EOTF.

One standard has been developed by Dolby as part of their Dolby Vision platform and has also been formalized as SMPTE ST 2084.  ST 2084 enables a peak luminance of 10,000 nits and relies on a metadata stream piggybacked on the broadcast.  The metadata describes how the respective content was originally mastered, which enables a compliant display to automatically adjust its settings to optimally display the content.  Because of this metadata stream, ST 2084 content is not backwards compatible, so it cannot be played on older TVs.
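For readers curious about what that transfer function actually looks like, here is an illustrative sketch of the PQ EOTF defined in ST 2084, which maps a normalized signal value onto absolute luminance up to 10,000 nits.  The constants come from the standard; the implementation itself is only a sketch:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value in [0, 1]
# to an absolute luminance in nits, with a ceiling of 10,000 nits.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Luminance in nits for a normalized PQ signal value in [0, 1]."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)

for code in (0.0, 0.5, 0.75, 1.0):
    print(f"signal {code:.2f} -> {pq_eotf(code):8.1f} nits")
# signal 1.00 maps to 10,000 nits, while a mid-range signal of 0.50 lands
# near 92 nits: PQ allocates most of its code values to darker tones.
```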

The competing standard, Hybrid Log-Gamma (HLG), is being developed and backed by the BBC and NHK.  HLG enables 17.6 stops of dynamic range on a display with a 2,000 nit peak and a 0.01 nit black level, and is an evolution of current standards that adds a log function to the top half of a traditional camera transfer curve (OETF).  HLG is therefore backwards compatible, and HLG content can be played on both standard dynamic range and HDR TVs.
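For comparison, here is an illustrative sketch of the HLG OETF as standardized in ITU-R BT.2100: a conventional square-root camera curve for the lower part of the signal range, with a logarithmic segment spliced onto the highlights.  Again, the constants come from the standard and the code is only a sketch:

```python
from math import log, sqrt

# Hybrid Log-Gamma (HLG) OETF from ITU-R BT.2100: normalized scene light
# E in [0, 1] maps to a signal value in [0, 1]. Below the knee it is a
# conventional square-root camera curve; above it, a log segment handles
# the highlights.

A = 0.17883277
B = 1 - 4 * A              # 0.28466892
C = 0.5 - A * log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """HLG signal value for normalized scene linear light e in [0, 1]."""
    if e <= 1 / 12:
        return sqrt(3 * e)          # SDR-like square-root segment
    return A * log(12 * e - B) + C  # logarithmic highlight segment

for e in (0.0, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {e:.3f} -> signal {hlg_oetf(e):.3f}")
# The two segments meet smoothly at e = 1/12 (signal 0.5), which is why
# SDR displays can still show HLG content with compressed highlights.
```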

HLG appears to have an edge with broadcasters at this point due to its lower overall complexity and backwards compatibility.  However, it’s most likely that both HDR standards will be implemented in commercial displays. For example, LG announced their 2015 4K TVs would support both HLG and SMPTE 2084.

The Case for Quantum Dots

Whichever HDR standard eventually achieves prominence, one thing remains clear: manufacturers need a cost-effective path to HDR that delivers both amazing visual quality and energy-efficiency, and that’s where quantum dots come in.

In addition to higher luminance ratings, HDR TVs will also require WCG standards to achieve optimal visual quality, and at this point the most likely candidate is BT.2020 (often referred to as Rec. 2020), introduced in 2012.

Commercial displays using quantum dot solutions — and edge-lit quantum dot solutions in particular — are the most energy-efficient way of achieving maximal BT.2020 coverage.

This is largely due to quantum dots’ inherent photonic efficiency, which results in nearly every photon of the incident light being down-converted into photons within a narrow spectral range.  The result is that quantum dot TVs require significantly less energy for the same color gamut performance.  As an example, our Color IQ solution requires between 50% and 100% less energy than traditional LED TVs at an equivalent level of color.

This not only gives manufacturers a clear path to energy-efficient HDR and wide color gamuts in their future displays, it also helps achieve one of my company’s primary goals: making the best color available to the most people.  No doubt this will be an increasingly hot topic of conversation in SMPTE circles, and with CES around the corner, the combination of HDR and quantum dots may just be the next big innovation story.

 

By John Ho, Product Marketing Manager, QD Vision