When shopping for a new display or projector, how do you determine which model has the best image quality, or even just the best quality for your limited budget? Unless you can compare two models in a side-by-side shootout, you'll have to base your purchase on trusted product reviews from sites such as this one, or on the manufacturer's ads and brochures. Either way, you're going to encounter a variety of technical terms used to describe potential image quality, including ANSI lumens, white light output, pixel resolution, contrast, color accuracy, and color bit depth.
OK, so you might have a hard time finding that last one. Color bit depth is often hidden on the specs page or described in some obscure way. However, bit depth is becoming an increasingly important metric for comparing projectors that claim the ability to reproduce wide color gamut (WCG) and high dynamic range (HDR) content. In fact, it may actually tell you more about a projector's potential image quality than its contrast, pixel resolution, or even color accuracy ratings—all of which can vary with display modes or focusing accuracy.
What Is Bit Depth and Why Does It Matter?
Theoretically, a projector's bit-depth rating describes the highest number of tonal values and colors that it can reproduce in any given frame of content. As the bit-depth rating increases (to a point, anyway), the number of colors and tonal values a projector can reproduce on screen increases exponentially, resulting in fewer jagged transitions and posterization effects (i.e., smoother blue skies), along with wider color gamuts and improved shadow and highlight details. The improvements are relatively easy to see as you increase bit depth from 1-bit to 8-bits per color, subtler between 8- and 10-bits, and difficult or impossible to notice among 10-, 11-, and 12-bits due to the limitations of the human eye.
How do you translate a projector's bit-depth rating into the number of colors it can reproduce? Let's first take the example of a monochrome projector that forms a single grayscale image on the screen. Its numeric bit-depth rating ("x" bits per color) can be used to quickly calculate the projector's entire range of unique grayscale values, from its deepest black to its brightest white. All you need to do is apply a simple power-of-two formula. For the grayscale calculation it's: 2^x = number of gray values. The chart below (Figure 1) shows you the results of the math for both grayscale-only and RGB color displays. For now, just have a look at the grayscale values; we'll discuss color later.
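If you'd like to check the chart's arithmetic yourself, here's a minimal Python sketch (ours, not part of Figure 1) that reproduces the grayscale column for a few common bit depths:

```python
# Each added bit doubles the number of unique gray values a channel can hold.
for bits in (1, 2, 4, 8, 10, 12):
    gray_values = 2 ** bits
    print(f"{bits:2d}-bit grayscale: {gray_values:,} gray values")
```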
As seen in the illustration below (Figure 2), once an 8-bit grayscale or full color scale is achieved, you won't see the incremental benefits of 10-bits per color on your computer monitor or tablet, and probably not even on a true 10-bit display or projector driven by your computer or other internet-connected device. For starters, that would require a true 10-bit illustration (our illustrations are capped at 8-bits per color thanks to web color limitations).
Furthermore, you shouldn't be misled by the test patterns and even some movies available for download from the internet that claim to be 10-bit targets or 10-bit per color movies. Most are not what they claim! Unless you can download the test patterns as intact 16-bit TIFF format photos (all JPEGs are limited to 8-bits per color), you should quit while you're ahead. It's even harder to find animated 10-bit test targets and videos, as nearly all popular video formats available for download, including AVCHD and .MP4, are limited to 8-bits per color content. Even if you can actually find true 10-bit files available for download, you'll still need a computer with a 10-bit graphics card and 10-bit capable software. Otherwise, you'll wind up viewing a smooth 8-bit target or movie that shows no difference when viewed on an 8- or 10-bit display.
Using the ProjectorCentral 10-Bit HDR Grayscale Animation
Fortunately, there is a simple way for any serious video enthusiast to download and view 10-bit test patterns to help assess their display. All 4K UHD Blu-ray players have built-in 10-bit per color graphics capability for playing back 4K UHD Blu-ray movies—all of which are stored in 10-bits per color HEVC format video. Most of these 4K UHD Blu-ray players and a few 4K media players, including the Roku 4K HDR, have a USB input that enables them to play back animated 10-bit per color test targets that have been saved in 10-bit HEVC format.
If you'd like to see how your own projector handles 10-bit signals, you can download the 10-bit per color animated test target you see below (Figure 3), created by In-Depth Focus Labs, from ProjectorCentral.com. The spinning wheels display a 10-bit grayscale between video levels 0 and 20 on the left, and levels 20 through 100 on the right. Although it should appear as a grayscale image, it is actually a full color pattern containing metadata tags that should automatically turn on the HDR and WCG modes in any HDR10 compatible display.
To download the target to a Windows PC, RIGHT-CLICK on the link below, select "Save Link As," and save to your preferred location. The 10-bit HEVC file will download to that folder. On a Mac, right-click and then select "Download Linked File" or "Download Linked File As."
Right Click to Download the Test Pattern Video File
To view the test pattern on your display, copy it to a USB flash drive and insert the drive into the USB media input on your UHD Blu-ray player. When you play the file from the disc player's built-in media player, it should be recognized by your display as a UHD resolution video with 10-bit bit depth, HDR, and BT.2020 color space.
As illustrated on the next page, obvious banding in the spinning wheels indicates that your display is playing back with less than full 10-bit bit depth.
Bring On The Color
Unlike a monochrome display, color monitors must form at least three grayscale images that represent the red, green, and blue data channels found in a standard SMPTE color signal. Most 3-chip projectors, whether using LCD, LCoS, or DLP imaging chips, start by using the data from each of the incoming R, G, and B data channels to form associated grayscale images. These are then illuminated by red, green, and blue lights (created by filtering a white light or using color LEDs or lasers) to form an overlapping full color image on screen (Figure 4).
Single-chip DLP projectors parcel out fractions of the R, G, and B data to form as many as seven grayscale images in rapid succession on the DLP micromirror device. Although none of these individual grayscale images contain the full number of tonalities found within the three individual RGB-based grayscale images, the totals should add up to the same in the end. White light from a bulb, colored LED, or a laser diode is then reflected off the DMD and passed through up to seven corresponding colors on a spinning wheel in order to form a full color image on screen.
In all of these color projector models, the total number of achievable colors winds up being the product of the grayscale values created by the three channels; the totals are listed in the 8-bit row of the column labeled "Potential R, G, B Color Values" in Figure 1.
For example, here's the math in an 8-bit per color display that forms three grayscale images:
8-bits per color channel: 2^8 = 256 gray values.
Total colors: (256R) x (256G) x (256B) = 16.7 million colors
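The same calculation generalizes to any per-channel bit depth. Here's a short Python sketch of the math (it also previews the 10-bit "billions of colors" figure discussed later):

```python
def total_colors(bits_per_channel: int) -> int:
    # Multiply the gray values available in each of the R, G, and B channels.
    return (2 ** bits_per_channel) ** 3

print(f"{total_colors(8):,}")   # 16,777,216 -> the familiar "16.7 million colors"
print(f"{total_colors(10):,}")  # 1,073,741,824 -> "over 1 billion colors"
```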
What Does Bit-Depth Look Like?
So, now you have some idea of how bit depth specifications relate to the number of grayscale gradations or colors that can be created by a display. But how do bit-depth variations actually look on the screen?
The type of artifact most closely associated with bit depth, or rather, the lack of it, is the banding artifact. It looks like what it sounds like: areas of the image that should ideally look smooth and show even transitions of light and color instead exhibit noticeable bands or outlines where the brightness or color visibly jumps from one level to another. The display simply lacks the ability to reproduce all the fine gradations called for by the signal.
Below (Figure 5) are examples of the 10-bit circular HDR grayscale target cited above as it should appear when properly processed at 10-bit depth (top), and with obvious banding as a result of being processed with only 8-bit or 9-bit depth (bottom). You can clearly see the banding steps in the darkest part of the test pattern, and more subtly, in the brighter part of the pattern.
As with the grayscale bit-depth chart shown in Figure 1 and the grayscale pattern above, differences in color bit depth can also manifest visibly as banding—although the eye is more forgiving with certain colors than others. The illustration below (Figure 6), for example, easily shows banding patterns on most displays between 12-bit and 24-bit color, but these would be harder to see when comparing 20- and 24-bit color.
In real world content on most 8-bit per color displays, you might perceive bit-depth banding issues in the transitions of light levels and colors in a sunset, or in the different hues of blue in a sky. Other bit-depth artifacts can be seen around the edges of objects, such as the transition between a planet in outer space and the halo of light surrounding it, or when one saturated color ends and another begins. Instead of a smooth tonal transition, you see a line or edging effect. For example, in the illustration below, shot in 4K HDR with 10-bit color depth, compare the out-of-focus, violet-tinged flowers behind the butterfly. The top frame in Figure 7 shows the out-of-focus flowers as they should appear with proper 10-bit processing. Below that is the frame processed at 8 bits per color.
If you look at the 10- and 8-bit close-ups shown below (Figure 8), you can clearly see halos around the edges of the 8-bit flowers as well as a visible dark band that outlines them. Also notice the missing details within the 8-bit flowers.
For another example, consider the photos below in Figure 9 of a real color spectrum—a double rainbow. The photo on top is processed with 10-bit color, while the bottom image is processed with 8-bit color. The 8-bit version suffers not only from posterization and banding artifacts in the sky, but the fainter of the two rainbows (the one on the left) practically disappears due to the combination of posterization and truncated color gamut. This is the kind of image degradation you might expect if you saw banding in the 10-bit grayscale test pattern discussed above, and illustrates what happens when three overlapping grayscale images (each with distinct banding issues) are used to form a color image.
Understanding Bit-Depth Specs
Having a basic understanding of how bit-depth relates to image quality isn't really enough when it's time to start shopping for a projector. Here are a few other important things to know before you start sifting through product marketing sheets and reviews.
X-bits per color, X-bits per pixel, X-bit color. Unfortunately, all of these terms are widely used to describe bit-depth capabilities in displays, which creates confusion. They don't always mean the same thing.
If we're talking about a monochrome display—something you won't likely be shopping for anytime soon—all three terms do refer to the same value. So a display deemed to have "8-bits per color" can also be described as an "8-bits per pixel" or simply an "8-bit color" display.
But in a full color display—more relevant to today's projectors—the term "X-bits per color" describes the number of tonal values found in each of the three grayscales formed from the video signal's R, G, and B data channels (as described in the previous section). So, if X=8, then "8-bits per color" generates 256 tonal values per color.
On the other hand, with color displays the terms "X-bits per pixel" or "X-bit color" describe the combined bit depth of all three grayscale channels. So in this case, you can expect to see the value of X expressed as three times the value you'd see associated with the term "X-bits per color." Therefore...if you see "24-bits per pixel" or just "24-bit color," it means the display has the same potential colors as a display labeled "8-bits per color." Easy to understand...if you're a math major.
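To make the terminology concrete, here's a small Python helper (purely hypothetical, for illustration only) that normalizes either style of spec back to bits per color channel:

```python
def bits_per_color(spec_bits: int, spec_is_per_pixel: bool) -> int:
    # "Per pixel" (or "X-bit color") specs spread their bits across the three
    # R, G, and B channels; "per color" specs already describe one channel.
    return spec_bits // 3 if spec_is_per_pixel else spec_bits

print(bits_per_color(24, spec_is_per_pixel=True))   # 8  -> same as "8-bits per color"
print(bits_per_color(30, spec_is_per_pixel=True))   # 10 -> same as "10-bits per color"
print(bits_per_color(10, spec_is_per_pixel=False))  # 10 -> already per color
```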
Millions vs. billions of colors. Why do some manufacturers list a display's color capability in terms of millions or billions of colors? The answer is simple: marketing! After years of describing display color capabilities in 8-bit terms, manufacturers came up with a description that made more sense to buyers: "millions of colors," or 16.7 million colors. Then 10-bits per color displays came along and the next logical marketing step was to claim "billions of colors," or "over 1 billion colors!" (1024R x 1024G x 1024B = 1.07 billion colors). To really confuse you, one manufacturer even lists its projector's color capability as 1,074 million colors! (the same as 1.07 billion colors, but a bigger number up front).
The problem with claiming billions of colors is that there is actually no such thing! According to vision experts, the average human can only discern around 12 million individual colors, while only a handful of people worldwide (all female) can discern about 100 million colors. Their expanded ability is due to a genetic mutation that produces a fourth color-sensitive cone in their eyes. Most with this ability don't even know they have it until tested, but I'll bet they enjoy a good sunset or rainbow when they see it.
Samsung is about the only manufacturer that has tried to set the record straight by claiming that its 10-bit displays are capable of displaying "Billions of color data combinations"—a technically accurate statement. Few have followed.
Let The Buyer Beware
For more than a decade, advanced photographers, videographers, and film directors have been aware of the advantages of capturing and processing color images and video with a minimum of 10-bits per color (30-bits per pixel). The RAW modes on most DSLR cameras store still photos in 10- or even 12-bits per color, and affordable 4K camcorders now have similar capabilities. On the computer side, every Mac currently sold has at least 10-bits per color graphics capability, as do the majority of PCs, image and video editing programs, and 4K or higher-resolution monitors used for image editing and advanced gaming.
However, it wasn't until 4K UHD Blu-ray movies and players became available, enabling the distribution of high dynamic range (HDR) and wide color gamut (WCG) content to a home audience, that 10-bits per color became an important feature for both flat-panel TVs and projectors. Before that, the marketing of displays and projectors had concentrated on increased resolution and, in some cases, improved color accuracy and extended color gamut reproduction. In 2015, 10-bits per color became the minimum acceptable color standard when the CEA released its guidelines for HDR10-compatible displays and projectors, which included a 10-bit requirement under the HDR10 Media Profile. Here are the parameters:
- EOTF: SMPTE ST2084
- Color Sub-Sampling: 4:2:0 (for compressed video sources)
- Bit Depth: 10 bit
- Color Primaries: ITU-R BT.2020
- Metadata: SMPTE ST 2086, MaxFALL, MaxCLL
The simplicity of the CEA definition may have created more confusion among consumers than it eliminated. A deeper read shows that all a projector or display has to do in order to claim "HDR10 compatibility" is accept an HDR content signal containing 10-bits per color data that's stored using BT.2020 color space coordinates and includes appropriate HDR metadata tags. But HDR10-compatible displays and projectors are not required to maintain 10-bits per color from input to output, or even to reproduce any wide gamut colors outside the standard dynamic range (SDR) Rec. 709 color space. That loophole was intentional, and left the door open for more affordable and "older-technology" 8-bit displays that are limited to Rec. 709 color gamuts (or slightly more) to be re-engineered to accept HDR and wide gamut color content from 4K UHD Blu-ray players without choking.
The TV industry has always prioritized backwards compatibility, and in this case it can be done with some internal processing tricks on the display or projector side, or within a computer or stand-alone media player. The result is that some displays with limited bit-depth capabilities are labeled as HDR-capable, but don't really meet the criteria or deliver the full image quality benefits of 10-bit HDR displays.
Here's how it typically works for an 8-bits per color display claiming to be HDR-compatible: When an incoming 10-bit HDR movie signal is detected, a front-end processor in the display downsamples the signal to 8-bits per color data, or creates dithered 10-bit colors. Next, the display applies a reverse HDR or HLG curve adjustment to counter the SMPTE ST 2084 EOTF contrast curve applied during the HDR mastering process. A color look-up table (LUT) is then applied to scale all the wide gamut colors the display can't reproduce to the closest in-gamut colors that it can. Additional image tweaks may include selective saturation, contrast, and blurring adjustments to minimize posterization and banding artifacts.
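As a rough illustration of that first downsampling step, here's a simplified Python sketch—not any manufacturer's actual pipeline, and real displays typically use ordered or error-diffusion dithering rather than the plain random noise shown here:

```python
import numpy as np

rng = np.random.default_rng(0)
ramp = np.arange(1024, dtype=np.float64)  # a smooth 10-bit grayscale ramp

# Plain truncation: every four 10-bit codes collapse into one 8-bit code,
# producing hard, uniform steps that appear on screen as banding.
truncated = np.floor(ramp / 4).astype(np.uint8)

# Dithered truncation: random noise scatters the step positions, so the
# average local brightness still tracks the original 10-bit ramp.
dithered = np.clip(np.floor((ramp + rng.uniform(0, 4, ramp.shape)) / 4),
                   0, 255).astype(np.uint8)

print(truncated[:12])  # [0 0 0 0 1 1 1 1 2 2 2 2] -- uniform 4-wide steps
print(dithered[:12])   # step edges land at randomized positions instead
```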
The result on the screen lands somewhere between an 8-bit SDR image and a 10-bit HDR image. You may still see some wide gamut colors in the 8-bit display's output, as 10-bits per color is not required to create many of the DCI-P3 gamut colors that fall outside the smaller Rec. 709 standard color gamut. However, no reasonably affordable 8-bit display or projector can achieve 100% coverage of the DCI-P3 wide gamut color space used to master and color grade 4K UHD Blu-ray movies, and 10-bit or higher color is required to achieve the additional colors found in the full BT.2020 color gamut.
Unfortunately, in addition to the color gamut limitations just described, downsampling and dithering also cause image-quality artifacts, including posterization effects, lost shadow and highlight details, banding in fine color gradations, and outlines appearing along the edges of fine tonal transitions (as shown above in Figures 6 through 8). All of these problems are overcome by a display or projector with true 10-bits per color processing and the ability to reproduce a color gamut approaching or exceeding 100% of the DCI-P3 color space.
Projectors and flat-panel TVs with true 10-bit processing and the improved image quality it enables are out there, and more affordable than you might think. But they're competing with some "HDR10-compatible" models that claim all sorts of HDR advantages yet don't reveal their 8-bit limitations until you see their output on screen, or learn about it in a product review. The lesson? If you're in the market for a new projector, make sure you do your homework.
Michael J. McNamara is the former Executive Technology Editor of Popular Photography magazine and a renowned expert on digital capture, storage, and display technologies. He is also an award-winning photographer and videographer, and the owner of In-Depth Focus Labs in Hopewell Junction, NY.