Is the UHD65 actually able to display 10-bit color?

This has been puzzling me for some time now, so I've been working out the math...

Being a single-chip design, the one DMD must display all 3 primary colors of each frame itself, sequentially. Furthermore, being a half-UHD-resolution chip that relies on pixel shifting, it must display all 3 colors of each frame 2x (once per shift position).

In 10-bit color, each primary color has 1024 possible values.

So 1,024 shades of each color x 3 colors x 2 shift positions = 6,144, the number of times each micro-mirror of the DMD would need to be able to flick per frame.

Most projectors don't actually show 24 fps content at only 24 fps; they typically show it at 48 or 72 fps to avoid flicker. But even if the UHD65 only shows 24 fps, that's 6,144 x 24 = 147,456 flicks per second.

And if it were showing 60 fps, that's 6,144 x 60 = 368,640 flicks per second.
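Just to keep my own arithmetic straight, here's a quick Python sketch of the numbers above. It simply reproduces my assumptions (one mirror flick per shade, per color, per shift position), not how TI actually drives the chip:

```python
# Worst-case flick count under this post's assumptions:
# one mirror flick per shade, per primary color, per pixel-shift position.
SHADES_10BIT = 2 ** 10  # 1024 possible values per primary color
COLORS = 3              # R, G, B shown sequentially by a single chip
SHIFTS = 2              # half-UHD DMD displays each frame twice

flicks_per_frame = SHADES_10BIT * COLORS * SHIFTS
print(flicks_per_frame)                     # 6144

for fps in (24, 60):
    print(fps, flicks_per_frame * fps)      # 24 -> 147456, 60 -> 368640
```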

Now, I could easily be mistaken, and I'm hoping someone can confirm that I am, but I have seen figures like "100,000 times per second" and "120,000 times per second" cited as the rate at which each individual micro-mirror in a TI DMD can flick.

Something around 150,000 flicks per second seems plausible. But nearly 370,000?

I'm not saying it can't be done. I'm really hoping someone can tell me if TI's newest DMD really can flick that many times per second!

But unless my math is incorrect, we're definitely looking at more than 300,000 flicks per second to reach 10-bit color at 60 fps, correct? Even if we subtract all of the "below black" and "above white" values from a standard video signal, we're still talking about roughly 880 gradients per color (the legal 10-bit video range of 64-940). So 880 x 3 x 2 x 60 = 316,800.
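And the same sketch for the video-legal-range variant, using my round figure of 880 shades (the actual 64-940 legal range is 877 values):

```python
# Same calculation, restricted to the video-legal range.
# 880 is a round figure for the 64-940 legal range (877 values).
VIDEO_SHADES = 880
print(VIDEO_SHADES * 3 * 2 * 60)  # 316800 flicks per second at 60 fps
```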

Please, if anyone can explain this to me, I'd greatly appreciate it!