A Further Definition of High-Def


The transition to all-digital broadcasting is about to get into full swing, with cable operators jockeying for field position with local TV stations as the way signals get delivered to customers changes irrevocably (“Pulling Back From the Digital Cliff,” page 19).

As this change gains momentum, more and more Americans will begin to turn on to the signature product of the move to delivering pictures in digital form: high-definition television.

But there's one point Dom Stasi would like to make about the stampede now under way from old-style analog sets in “standard” definition to new-style digital sets in “high” definition.

Analog, he said at the Independent Show, is always better than digital.

It's the engineer's view. Information theory and its guru, Claude Shannon, back him up, he said.

Stasi is the chief technology officer of TVN Entertainment, which supplies cable operators with hubs that control the delivery and playback of their on-demand programming.

And he's been around. Places like MTV Networks and HBO. Liberty Media. Even the Apollo space program.

The point stands to reason. With analog pictures, waves carry exact replicas of the images that come in through the lenses of cameras. A picture is copied a line at a time and the sharpness is determined by how many hundreds of lines are created and then sent over the air to homes.

With digits, it's really only a matter of how much compression of information you are trying to achieve. You're sampling the image, creating discrete bits of information. And trying to figure out just how few bits you can send to have the picture reconstituted at the other end.
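That sampling tradeoff is easy to see in miniature. Here is a toy Python sketch (an illustration of the general principle, not TVN's or anyone's actual encoder): quantizing a smooth "analog" waveform with fewer bits per sample shrinks the data, but the reconstructed signal drifts further from the original.

```python
import math

def quantize(samples, bits):
    """Quantize samples in [-1, 1] onto 2**bits discrete levels."""
    levels = 2 ** bits
    step = 2.0 / (levels - 1)
    return [round((s + 1.0) / step) * step - 1.0 for s in samples]

# The "analog" source: one cycle of a sine wave, finely sampled.
analog = [math.sin(2 * math.pi * t / 100) for t in range(100)]

for bits in (8, 4, 2):
    digital = quantize(analog, bits)
    err = max(abs(a - d) for a, d in zip(analog, digital))
    print(f"{bits} bits/sample -> worst-case error {err:.4f}")
```

Fewer bits means fewer discrete levels, so the reconstruction error grows — the engineer's version of Stasi's point that digits only ever approximate the original wave.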

Of course, analog signals run into interference on the way, which degrades the picture. But, technically speaking, analog technology sets the standard for flicker-free, high-resolution images, in Stasi's book.

The problem, Stasi notes, is that delivering high-resolution analog pictures requires lots of bandwidth. Twenty years ago, the high-definition pictures being generated by Toshiba and other Japanese TV makers were gorgeous — and analog. But they took 22 MHz of bandwidth to deliver, according to Stasi.

By comparison, a standard-definition TV channel takes up 6 MHz. And the Federal Communications Commission is booting broadcasters' analog signals off the airwaves to free up more space for better uses of spectrum.
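The spectrum arithmetic is blunt. Using the figures in this article (Stasi's 22 MHz estimate and the standard 6 MHz U.S. broadcast channel — not an FCC tally):

```python
ANALOG_HD_MHZ = 22   # Stasi's figure for 1980s-era analog HDTV
CHANNEL_MHZ = 6      # one standard U.S. broadcast channel

# One analog HD picture would swallow the spectrum of several
# whole standard channels.
channels_displaced = ANALOG_HD_MHZ / CHANNEL_MHZ
print(f"One analog HD signal spans {channels_displaced:.2f} standard channels")
```

Nearly four standard channels' worth of spectrum for a single picture — which is why the regulators, and the market, went the other way.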

But the fact is, the move to high-definition digital television is actually a step down in quality, according to Stasi and engineers like him. The benefits of digitization are in flexibility and features. You can pack more signals into a given amount of bandwidth. You can put lots of different pictures on the same screen, overlay text messages, program guides and ad windows.

That doesn't mean the picture is as good as it gets. If the over-$100,000 set really wanted high-definition pictures, they'd have insisted on analog versions decades ago.

But the affluent didn't. Because such sets themselves could have cost $100,000. Like Ferraris.

Now, engineers like Stasi are left to figure out just how much of a picture to leave out and still call it “high definition.” And how far to take the conversion of a “high-definition” digital signal into other streams of information that can be reconstituted into TV shows on different kinds of TV sets.

If cable operators have their way, for instance, analog TV sets actually won't go away after the digital-TV broadcasting “hard date” passes on Feb. 17, 2009.

They'll “demodulate” the high-definition digital signal they receive from a local TV station into a standard-definition digital signal, for the batch of sets that can handle digits — just not lots of them — on screen.

Then, they'll “remodulate” the signal into an analog version for all those tens of millions of sets that otherwise will be tossed in the garbage because they can't handle digits at all.
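Whatever it ends up being called, the conversion amounts to throwing picture information away. A toy Python sketch of line-dropping downconversion (real headend and set-top hardware uses proper filtered resampling, not bare decimation; the 1080- and 480-line figures are common HD and SD raster heights, not from this article):

```python
def downconvert(frame, target_lines):
    """Keep target_lines evenly spaced scan lines from a frame.
    A toy stand-in for the filtered resampling real hardware does."""
    src = len(frame)
    return [frame[i * src // target_lines] for i in range(target_lines)]

hd_frame = [f"line {i}" for i in range(1080)]   # 1080-line HD raster
sd_frame = downconvert(hd_frame, 480)           # 480-line SD raster

print(len(sd_frame))         # lines that survive
print(1080 - len(sd_frame))  # lines discarded on the way down
```

More than half the scan lines never make it to the standard-definition set — which is exactly the “stripping” the broadcasters objected to.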

That process at the start of the year was called “downconversion,” until the National Association of Broadcasters complained that it amounted to “stripping” details out of its members' signals.

Then, it became known simply as “conversion,” to avoid the connotation that quality was being taken out or “down.”

But to Stasi, that's not the right nomenclature for describing what happens when digits are turned back into waves.

“That's not conversion,” Stasi said. “That's 'analog enhancement.' ”