Better Together: Combining HDR and Quantum Dots


I recently had the pleasure of attending the Society of Motion Picture and Television Engineers (SMPTE) annual conference, held in downtown Hollywood adjacent to the historic TCL (formerly Grauman’s) Chinese Theater.

I’ve paid close attention to this event for many years, as it always produces valuable content and great thinking, but this year was especially significant. As usual, hundreds of professionals from the film and broadcast industries gathered to discuss research findings, show off the latest technology innovations, and ultimately to push for broader standardization within the industry.

Amidst it all, three important themes emerged:

  1. Ultra High Definition (UHD) cinema and TV are on the cusp of a historic transition to more immersive and realistic content, above and beyond what we’ve seen so far with 4K resolution.
  2. This transition is largely being driven and enabled by two technologies that are usually discussed separately, but that are increasingly interconnected and complementary: High Dynamic Range (HDR) and Wide Color Gamut (WCG).
  3. Display manufacturers are not waiting for the standards to get sorted out, meaning there will be many flavors of HDR in the marketplace, with the versions that combine WCG (i.e., using quantum dots) showing the best overall improvement in visual quality.

Beyond 4K

Since the inception of TV in the 1940s, major cycles of innovation have pretty much looked the same.  The introduction of new technologies drives production of new, more advanced content, which then drives demand for the most advanced TVs and monitors, and so on. The majority of these cycles have focused on increasing screen resolution and size, rather than dramatic improvements in color and visual performance, as illustrated by the timeline below.

Color has tended to take a back seat to resolution and size increases.  For example, when the industry shifted from analog to digital broadcasting in the early 2000s, moving from NTSC to the Rec.709 standard, color performance actually decreased considerably.

We’re in one of these major innovation cycles now, moving from HD to UHD, but this cycle is shaping up to be quite different from the many that came before.  Not only are we making a significant move up in resolution, from 1080p to 4K, we’re also taking an unprecedented leap forward in color with WCG and in visual performance with HDR (not to mention other emerging technologies such as high frame rates).

You’ve likely been noticing a lot of buzz about these two technologies lately, and for good reason.

Many of the HDR demonstrations I saw at SMPTE were better than anything I’ve seen commercially available to date.  There was even a quantum dot monitor covering 94% of BT.2020, showing colors not accessible on even the best OLED displays.

Demo content had dramatically deeper blacks and a color richness that, when combined with 4K resolution, created an immersive effect that has to be seen to be truly appreciated. At times I felt like I was watching real life through a window.

But the biggest discussions at SMPTE didn’t involve HDR’s picture quality potential. That’s already widely acknowledged. The real question on everyone’s mind is about finding the best path to get there.

The Luminance Question

At the center of the debate is the necessary level of peak luminance and its impact on the Electro-Optical Transfer Function (EOTF).  There is a great article that goes into more background on EOTF here.

Luminance is measured in candelas per square meter (cd/m²), commonly known as nits, and in nature reflected luminance varies considerably across the different areas within a person’s field of vision.  Some areas in shadow or darkness may have an extremely low luminance of less than one nit, while other areas in full sunlight may reach 15,000 nits or more.

Today’s reference monitors are calibrated to a maximum luminance of 100 nits and a minimum luminance of 0.117 nits, significantly limiting their ability to accurately reproduce what we see in real life. HDR aims to close that gap by introducing a luminance range that is far more closely aligned with the natural world.  But in order to increase nit count, brighter LED backlights are needed, which increases the overall power requirements of the system.
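For readers who like to see the arithmetic, here is a quick back-of-the-envelope sketch (in Python, purely for illustration; the helper name stops is mine) that converts these luminance ranges into photographic “stops,” i.e. log2 of the peak-to-black ratio. The numbers simply reuse the figures mentioned above.

```python
import math

def stops(peak_nits, black_nits):
    """Dynamic range in photographic stops: log2 of the peak/black luminance ratio."""
    return math.log2(peak_nits / black_nits)

# Today's SDR reference monitor: 100-nit peak, 0.117-nit black (figures from the text).
print(f"SDR reference monitor: {stops(100, 0.117):.1f} stops")   # ~9.7 stops

# Rough natural-scene range from the text: ~1-nit shadows up to ~15,000-nit sunlight.
# (The true range is even wider, since shadows can fall well below one nit.)
print(f"Natural scene (approx): {stops(15000, 1):.1f} stops")    # ~13.9 stops
```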

So the real question becomes: how can display manufacturers reach the significantly higher nit counts required for HDR while still meeting stringent energy-efficiency standards such as Energy Star in the U.S. and E.E.I. in China? The answer depends not only on how much luminance display manufacturers can squeeze out of the backlight, but also on which HDR standards are adopted moving forward.

Competing HDR Standards

The two most prominent HDR standards both center on the EOTF.

One standard has been developed by Dolby as part of its Dolby Vision platform and has been formalized as SMPTE ST 2084.  SMPTE 2084 enables a peak luminance of 10,000 nits and piggybacks a metadata stream on every broadcast.  The metadata describes how the content was originally mastered, which lets a compliant display automatically adjust its settings to present that content optimally. Because of this metadata stream, 2084 content is not backwards compatible, so it cannot be played on older TVs.
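For the technically curious, the ST 2084 curve (the “Perceptual Quantizer,” or PQ) is defined analytically in the standard. The short Python sketch below implements that published EOTF so you can see how a normalized code value maps to absolute luminance up to the 10,000-nit ceiling; the constants come from the standard, but treat the script itself (and the function name pq_eotf) as an illustration rather than a reference implementation.

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal in [0, 1] -> luminance in nits."""
    m1 = 2610 / 16384            # 0.1593017578125
    m2 = 2523 / 4096 * 128       # 78.84375
    c1 = 3424 / 4096             # 0.8359375
    c2 = 2413 / 4096 * 32        # 18.8515625
    c3 = 2392 / 4096 * 32        # 18.6875
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

# Full-scale code maps to the 10,000-nit peak, while mid-range codes stay dim;
# this is how PQ concentrates precision where human vision is most sensitive.
for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"signal {v:.2f} -> {pq_eotf(v):8.2f} nits")
```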

The competing standard, Hybrid Log-Gamma (HLG), is being developed and backed by the BBC and NHK. HLG delivers 17.6 stops of dynamic range on a display with a 2,000-nit peak and a 0.01-nit black level, and it is an evolution of current standards that adds a log function to the top half of a traditional camera transfer curve (the OETF). HLG is therefore backwards compatible, and HLG content can be played on both standard dynamic range and HDR TVs.
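For comparison, here is a sketch of the HLG camera curve as specified in ITU-R BT.2100 (again in Python, with the function name hlg_oetf being mine). The lower part of the range keeps a conventional square-root response, which is what preserves backwards compatibility with SDR displays, while the log segment on top carries the extra highlight range. The last line double-checks the 17.6-stop figure quoted above.

```python
import math

def hlg_oetf(e):
    """ITU-R BT.2100 HLG OETF: scene-linear light E in [0, 1] -> non-linear signal."""
    a = 0.17883277
    b = 1 - 4 * a                  # 0.28466892
    c = 0.5 - a * math.log(4 * a)  # 0.55991073
    if e <= 1 / 12:
        return math.sqrt(3 * e)    # square-root segment: behaves like a conventional camera curve
    return a * math.log(12 * e - b) + c   # log segment: carries the extra highlight range

# Sanity check on the dynamic-range figure quoted above:
# a 2,000-nit peak over a 0.01-nit black level.
print(f"{math.log2(2000 / 0.01):.1f} stops")   # ~17.6
```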

HLG appears to have an edge with broadcasters at this point due to its lower overall complexity and backwards compatibility.  However, it’s most likely that both HDR standards will be implemented in commercial displays. For example, LG announced their 2015 4K TVs would support both HLG and SMPTE 2084.

The Case for Quantum Dots

Whichever HDR standard eventually achieves prominence, one thing remains clear: manufacturers need a cost-effective path to HDR that delivers both amazing visual quality and energy efficiency, and that’s where quantum dots come in.

In addition to higher luminance ratings, HDR TVs will also require a WCG standard to achieve optimal visual quality, and at this point the most likely candidate is BT.2020 (often referred to as Rec.2020), introduced in 2012.

Commercial displays using quantum dot solutions — and edge-lit quantum dot solutions in particular — are the most energy-efficient way of achieving maximal BT.2020 coverage.
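To make “BT.2020 coverage” a bit more concrete, the sketch below uses one common approximation: comparing triangle areas in CIE 1931 xy chromaticity space, with the BT.2020 primaries from the recommendation and, as a hypothetical stand-in display, the Rec.709 primaries. Published coverage figures are usually computed as an intersection area (often in the u′v′ plane), so this is an illustration of the idea, not a metrology tool.

```python
def triangle_area(p1, p2, p3):
    """Area of a triangle given three (x, y) chromaticity points (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE 1931 xy primaries from ITU-R BT.2020 and, as a stand-in display, BT.709.
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
BT709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

# Since the 709 triangle sits entirely inside BT.2020, the area ratio equals coverage here.
coverage = triangle_area(*BT709) / triangle_area(*BT2020)
print(f"Rec.709 covers only ~{coverage:.0%} of the BT.2020 triangle")  # ~53%
```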

This is largely due to quantum dots’ inherent photonic efficiency, which results in nearly every photon of the incident light being down-converted into photons within a narrow spectral range. The result is that quantum dot TVs require significantly less energy for the same color gamut performance. As an example, our Color IQ solution requires 50-100% less energy than traditional LED TVs at an equivalent level of color.

This not only gives manufacturers a clear path to energy-efficient HDR and wide color gamuts in their future displays, it also helps achieve one of my company’s primary goals: make the best color available to the most people.  No doubt this will be an increasingly hot topic of conversation in SMPTE circles, and with CES around the corner, the combination of HDR and quantum dots may just be the next big innovation story.

 

By John Ho, Product Marketing Manager, QD Vision
