Reader’s Guide to ‘Up-Rezzing’


Three things are indisputable about high-definition television: More channels are coming, more homes can afford the HD sets to display them, and consumers harbor big opinions about which service gives the best — and worst — HD pictures.

Right now, those opinions are all over the place. Depending on the blog, the “hands down best” HD pictures come from AT&T. And DirecTV. And cable, and EchoStar, and Verizon.

Ditto for the “hands down worst.”

DirecTV is on the receiving end of most of the recent blog angst, primarily because it’s moving at ramming speed toward its year-end promise of 100 channels.

It’s risky, though, to take comfort in DirecTV’s woes.

Why? Because there is no official technical benchmark for what constitutes “an HDTV picture.” Ditch any daydreams about pressing a button on the remote to find out what resolution you’re getting, the way you can run a test on the PC to find out what broadband speed you’re really getting. So far, that button doesn’t exist.

What we’re left with is … people’s opinions. Even the experts in picture resolution are quick to point out that quality is highly subjective. My eyes see differently than yours, and your eyes see differently than the person nearest to you right now. It’s a byproduct of being a human.

Plus, glitches in picture quality can’t easily be pinned to how a program is distributed — over satellite, over cable, over fiber, over copper. From the time a program is created to the time it shows up on your snazzy new flat-panel HD set, it’s probably been “touched,” meaning manipulated, at least four times.

Then there’s the simple fact that today’s larger, higher-quality TV sets show glitches larger and more distinctly.

(Aside: At the Consumer Electronics Show, in January, a chief technologist from a major program network said that his big “aha” was that for the first time ever, TV displays outperformed distribution networks, in terms of how much picture information they could display.)

The blog buzz seems to center on what programs are “true HD,” versus an “up-rez” version. (“Up-rez” is HD shorthand for “up-resolution.”)

Here’s what that means: At any program network, right now, some fraction of its content library was mastered in an HD format. The rest was not. The latter category will need more bits, in order to “look good” on those big, beautiful HDTV displays. That’s the up-rez.

The extent to which a program or movie can be “up-rezzed” also depends on how it was stored. If it’s on film, you’re good. If it’s on videotape, not so good.

The process for transferring master film reels into a digital, high-definition format is known as “telecine” (pronounced “tele-sinny”). It usually starts with a clean-up, to remove any dirt, scratches, hair or other visible glitches. It’s expensive and time-consuming, but in the end, it’s true HD.

The process for up-rezzing videotape content is less accurate, and is part (part!) of the reason why some pictures look better than others on HD screens.

Up-resolution, as a technique, has two main components: line-doubling and interpolation.

Line-doubling is a method used on the vertical part of the picture, as it is “drawn” on the screen — more lines, more bits, more picture. Interpolation is the addition of bits within the horizontal lines, in a way that is hopefully creative enough to estimate what’s really happening in the picture. If the interpolator sees a line of red dots, maybe it adds another red dot.
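For the curious, those two steps can be sketched in a few lines of Python. This is a deliberate oversimplification — real up-converters use far more sophisticated filters than repeating rows and averaging neighbors — but it shows the basic idea of “clever guessing”:

```python
# Toy up-rez: line-doubling (repeat each scan line) plus horizontal
# interpolation (guess a new pixel between each pair of neighbors).
# Pixels here are just brightness values, 0-255.

def line_double(image):
    """Repeat every row, doubling the vertical line count."""
    doubled = []
    for row in image:
        doubled.append(list(row))
        doubled.append(list(row))
    return doubled

def interpolate_row(row):
    """Insert the average of each neighboring pair -- the 'guess'."""
    out = [row[0]]
    for left, right in zip(row, row[1:]):
        out.append((left + right) // 2)  # estimated, not real, picture data
        out.append(right)
    return out

def up_rez(image):
    """Apply both steps: more lines, then more bits per line."""
    return [interpolate_row(row) for row in line_double(image)]

# A tiny 2x2 "picture" becomes 4 lines of 3 pixels each.
small = [[10, 20],
         [30, 40]]
big = up_rez(small)
```

Half the pixels in `big` were never in the source material — they’re estimates, which is exactly why an up-rezzed show isn’t “true HD.”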

The point is, up-rezzing is based on clever guessing, but it’s still a guess. It is not “true HD.” From there, that up-rezzed show is compressed and sent along — where it might be compressed, decompressed, and recompressed a few times before it gets to the HDTV in front of Consumer Jane’s couch.

Right now, the name of the HD game is volume: Who has the most channels. The next chapter will probably be about quality.

Quality, in picture resolution, is about how much picture information there is, which depends on how many bits are used, which depends on how much bandwidth is available.
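To put rough, back-of-envelope numbers on that chain — picture information, bits, bandwidth — here’s a quick calculation. These are raw, uncompressed figures (assuming 24 bits per pixel and 30 frames per second); real services compress them down dramatically before transmission:

```python
# Raw picture information, before any compression.
def raw_bits_per_second(width, height, bits_per_pixel=24, fps=30):
    """Bits needed per second for an uncompressed picture."""
    return width * height * bits_per_pixel * fps

sd = raw_bits_per_second(720, 480)     # standard-definition raster
hd = raw_bits_per_second(1920, 1080)   # "full" HD raster

# HD carries six times the raw picture information of SD --
# which is why bandwidth decides how good the picture can look.
ratio = hd / sd
```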
