Does your projector do TRUE HDTV?

Evan Powell, January 20, 2005

Have you heard this one yet from a projector salesman.... "You don't want to buy THAT projector...it doesn't do TRUE HDTV." Well, certainly nobody would want to buy a projector that didn't do real HDTV, right? But they all claim to do HDTV. So what's the scoop?

It is easy to understand why the confusion exists. But it is also easy to sort it all out. First, let's start by defining HDTV. There are two common HDTV formats in use today, usually referred to as 1080i and 720p. The numbers refer to the number of horizontal lines in each frame of video (also known as "vertical resolution" since it is the number of horizontal lines as counted vertically from top to bottom of the screen). So in a 1080i signal, there are 1,080 lines per frame of video, and in a 720p signal there are 720 lines per frame.

The "i" and "p" indicate whether the signal is interlaced or progressive. In an interlaced signal, all of the even numbered lines are transmitted in one batch, followed by all of the odd numbered lines. (This is done to reduce transmission bandwidth.) In a progressive signal, all lines of the frame are transmitted at once in sequence. So with the interlaced 1080i signal, only 540 lines are recorded by the camera and transmitted at a time; they are then reassembled at the time of display. Meanwhile, with 720p, all 720 lines are recorded and transmitted in sequence.

Both of these signal formats maintain a 16:9 aspect ratio. That means the picture is 16 units in width for every 9 units in height. This is what has become known as the standard widescreen television format: all widescreen-format TVs, plasmas, and projectors have a native 16:9 aspect ratio these days.

In order for an HDTV signal to maintain a 16:9 aspect ratio that matches the widescreen format, it needs to have 16 pixels on each line for every 9 lines of video in the frame. So a 1080i signal has 1920 pixels horizontally. That is why you will sometimes see the actual resolution of the 1080i format designated as 1920x1080. (If you divide 1920 by 16, then multiply the result by 9, you get 1080.)

Similarly, a 720p format signal has 1280 pixels on each line. So the physical resolution of the 720p format is often noted as 1280x720. (Once again, if you divide 1280 by 16, then multiply the result by 9, you get 720.)
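
If you want to check the arithmetic yourself, here is a tiny illustrative Python snippet that works it in both directions, assuming square pixels:

```python
def width_for(lines, aspect_w=16, aspect_h=9):
    """Horizontal pixel count that keeps a 16:9 picture, assuming square pixels."""
    return lines * aspect_w // aspect_h

print(width_for(1080))  # 1920 -> 1920x1080 (1080i)
print(width_for(720))   # 1280 -> 1280x720  (720p)

# And back again: divide the width by 16, then multiply by 9.
print(1920 // 16 * 9)   # 1080
print(1280 // 16 * 9)   # 720
```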

So far, so good. Now....what is TRUE HDTV? This is where it gets confusing, because people use the term to mean different things. Some people think that the only real, legitimate HDTV format is 1080i because it has the highest physical resolution. So they refer to 1920x1080 as true HDTV. Others have been calling 1080i "full HDTV," presumably to distinguish it from the less full 1280x720.

Fans of the 720p format object to this. They point out that progressive scanning produces a cleaner, higher resolution signal when the subject is in fast motion. It has no deinterlacing fuzziness. And since the 1080i camera captures only 540 lines at a time, the actual resolution of 1080i when the subject is in motion is only 540 lines, not 1080. So many folks think 720p is better for rapid motion sports like football and soccer, while 1080i is better for, say, golf, where people are just basically standing around.

The fact is that both 1080i and 720p are great HDTV formats that look a lot better than standard television. Both formats are being broadcast by the major networks today, so your projector needs to be able to display both of them, and all projectors that are HDTV compatible do in fact display both of them.

So what does it mean to ask "does your projector display true HDTV?" Often what is really meant is, "does it need to re-scale the image?" In other words, does the video information coming in on the HDTV signal need to be either compressed or expanded to fit the physical resolution of the projector? In most cases, it does.
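
To make "re-scaling" concrete, here is a rough illustrative Python sketch of the idea, using simple nearest-neighbor sampling (real projector scaling chips use far more sophisticated filtering) to squeeze a 1920x1080 frame onto a 1280x720 panel:

```python
def rescale(frame, out_h, out_w):
    """Nearest-neighbor resample: frame is a list of rows, each row a list of pixels."""
    in_h, in_w = len(frame), len(frame[0])
    return [
        [frame[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# A dummy 1920x1080 "image" (every pixel is 0), compressed to fit a 1280x720 panel.
hd_frame = [[0] * 1920 for _ in range(1080)]
panel = rescale(hd_frame, 720, 1280)
print(len(panel), len(panel[0]))  # 720 1280
```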
