Here’s one that just never gets old: Picture quality.
For marketers of HDTV service, picture quality is the differentiator that comes after “more channels.”
For bloggers, picture quality elicits a steady flow of commentary and frame-grabs about who has the best (or worst, usually) HD picture.
For video engineers, picture quality screams for empirical aid. A solid, repeatable way to quantify picture quality is something most video engineers have long craved.
Here’s why measuring picture quality is so hard: My eyes see differently than your eyes. Your eyes see differently than the person sitting next to you. They just do. Because we’re human, what constitutes a “good picture” is largely subjective.
This matters, especially now. For the first time in the roughly 70-year history of television, the picture quality capabilities of some (1080p) HDTV sets surpass what can be fed into them by just about any service provider: Cable, satellite, or telco.
Blu-ray players, by contrast, do match 1080p sets in picture quality capability. Mostly this has to do with the frames per second that get blasted into the HDTV – 60, for Blu-ray, and 30, for just about everybody else.
That means that as consumers begin to replace their DVD players, the pictures dazzling forth from their new Blu-ray machines will look better than the pictures coming from multichannel video providers – even, and perhaps especially, the HD stuff.
Ditto for handheld 1080p HDTV cameras, as they price their way into the consumer mainstream. Do you want your dog to look better on your HD screen than your favorite channel? (Don’t answer that.)
NOT A NEW PROBLEM
The problem of quantifying video quality goes way back. In the late 1950s, for instance, the Federal Communications Commission established a group called the Television Allocations Study Organization (TASO, said as “tay-so”).
Part of TASO’s mission was to establish a 6-grade scale for video quality. A TASO Grade 1 meant excellent viewing. TASO Grade 6: Unviewable picture. Grades 2 and 3 were the goal, and became the norm.
TASO grades were scored by an elite batch of “golden eye” experts. This is sort of like seeing around corners, except it involves seeing artifacts in TV pictures that other people usually can’t.
A parallel also exists for quantifying audio quality: The MOS, or Mean Opinion Score, which ranks perceived audio quality, for phone calls and music, on a 1-5 scale.
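For the curious: the arithmetic behind a MOS is about as simple as it gets – the average of everybody’s 1-5 ratings. Here’s a minimal sketch (the function name and the sample ratings are made up for illustration):

```python
def mean_opinion_score(ratings):
    """Average a list of 1-5 subjective quality ratings into a MOS."""
    if not all(1 <= r <= 5 for r in ratings):
        raise ValueError("MOS ratings must fall between 1 and 5")
    return sum(ratings) / len(ratings)

# Hypothetical scores from a five-person listening panel:
panel_ratings = [4, 5, 3, 4, 4]
print(mean_opinion_score(panel_ratings))  # 4.0
```

The hard part, of course, isn’t the averaging – it’s rounding up enough trained ears (or golden eyes) to do the rating in the first place.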
The challenge now is to find a way to do for digital video what TASO did for analog video, and MOS did for audio. That’s what’s cooking at companies like IneoQuest, Imagine Communications, and Symmetricom, among others.
This tale will go on for as long as there is competition for HDTV displays and services. Best get your golden eyes on.
Stumped by gibberish? Visit Leslie Ellis at