
Buying The Right HDTV (Part One)


If you're like a lot of folks, you didn't buy a high-definition television for Christmas, but you're thinking hard about getting one this year. The good news is that prices are likely to continue their sharp decline in coming months. The bad news is that the consumer-electronics industry is making TV shopping unnecessarily difficult.

The biggest problem is a profusion of standards, along with the occasional made-up term, which together make it hard to compare products. In this column, I'll try to guide you through the thicket of terminology, and next week I'll help you choose among the myriad display technologies.

The first point of confusion is digital television vs. HDTV. While the terms are often used interchangeably, they shouldn't be. All HDTV in North America is digital, but not all digital TV is HD, and it's the high-definition part that gives you the big improvement in image quality, especially on large screens. A second issue is that manufacturers bill some displays as "HDTV-ready." This means they're capable of showing an HD picture -- but only with a digital TV receiver that costs $400 or so. Although this is a bit deceptive, the separation may not be a bad thing. A stand-alone receiver offers more flexibility, for example, in dealing with the copy-protection schemes that will be used with some broadcast and cable HDTV content starting next year. My personal choice would be a pure monitor and a separate tuner/receiver, but a lot of people prefer integrated sets because they don't want the hassle of setup.

UNDERSTANDING DISPLAY CHOICES requires delving into the mind-bending world of TV standards and formats. A TV display consists of hundreds of so-called scan lines -- the more lines, the better the picture. Standard U.S. analog TV uses 480 lines, but these are handled in a peculiar way. Every 60th of a second, all the odd-numbered lines are drawn; then, in the next 60th of a second, the even lines. This process, called interlacing, produces 30 frames per second, but the resulting image is generally less sharp than if the lines had been displayed sequentially, in what is called progressive scan. In describing these two approaches, a letter "i" following the number indicates interlacing; a "p" indicates progressive scan.
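
To make the two scan orders concrete, here is a minimal Python sketch -- my own illustration, not anything drawn from the TV standards themselves -- of how a 480-line frame is delivered each way:

    # Contrast interlaced and progressive delivery of a 480-line frame.
    LINES = 480          # visible scan lines in standard U.S. analog TV
    FIELDS_PER_SEC = 60  # half-frames drawn per second when interlacing

    def interlaced_frame(lines=LINES):
        """One interlaced frame: the odd-numbered lines, then the even."""
        odd = list(range(1, lines + 1, 2))   # drawn in the first 1/60 second
        even = list(range(2, lines + 1, 2))  # drawn in the next 1/60 second
        return odd, even

    def progressive_frame(lines=LINES):
        """One progressive frame: every line in order, top to bottom."""
        return list(range(1, lines + 1))

    odd, even = interlaced_frame()
    print(len(odd) + len(even))    # 480 lines per complete frame
    print(FIELDS_PER_SEC // 2)     # 30 full frames per second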

As defined by the Arlington (Va.)-based Consumer Electronics Assn., true HDTV requires at least 720p. Many broadcast stations, however, offer only the 480i format on their digital channels -- which yields little or no quality improvement over standard TV. You can do a bit better with a DVD player that offers progressive scan. Its output is 480p, which the CEA calls Enhanced Definition TV. When used with a wide-screen display, it provides improved quality. But it's not HD.
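
To summarize those definitions, here is a hypothetical little classifier -- the function name and category strings are my shorthand, not the CEA's official labels -- that sorts the common format designations:

    # Classify a format label such as "480i" or "1080i" using the
    # definitions above: true HD means at least 720 progressive lines
    # (1080i also qualifies); 480p is Enhanced Definition; 480i is not
    # an upgrade over standard TV.
    def classify(fmt: str) -> str:
        lines, scan = int(fmt[:-1]), fmt[-1]
        if lines >= 1080 or (lines >= 720 and scan == "p"):
            return "HDTV"
        if lines == 480 and scan == "p":
            return "Enhanced Definition TV"
        return "Standard Definition"

    for f in ("480i", "480p", "720p", "1080i"):
        print(f, "->", classify(f))
    # 480i -> Standard Definition
    # 480p -> Enhanced Definition TV
    # 720p -> HDTV
    # 1080i -> HDTV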

To qualify as true HDTV, a display must also have a 16:9 screen, about 1 3/4 times as wide as it is high -- closer to the format of movies than of television. Regular old TVs have screens 1 1/3 times as wide as high. HD broadcasts offer either 720p or 1080i, and HD displays automatically convert the signal to the best quality they can handle. On screens up to around 40 inches, more than 720 lines offers little or no quality improvement; on bigger displays, a maximum resolution of 1080 lines is highly desirable. But any HD format offers a dramatic quality improvement over standard TV.
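
Those width-to-height figures are just the aspect ratios worked out; a two-line check (my arithmetic, not the column's) confirms them:

    print(round(16 / 9, 2))  # 1.78 -- about 1 3/4 times as wide as high
    print(round(4 / 3, 2))   # 1.33 -- regular TV, 1 1/3 times as wide as high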

One thing to beware of: displays that boast they can handle 720p or 1080i inputs while mumbling about their actual display resolution. Some less expensive models, sometimes called "HDTV-compatible," convert the HD input to display at 480p. Since it's the output resolution that determines what you see, you'd be getting a wide-screen TV at standard resolution.
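
The warning boils down to one rule of thumb, sketched here as a toy model of my own rather than any industry formula: what you see is capped by the display's native resolution, whatever inputs it accepts.

    # The picture that reaches your eyes is the lesser of the input
    # resolution and the display's native resolution.
    def effective_lines(input_lines: int, native_lines: int) -> int:
        return min(input_lines, native_lines)

    print(effective_lines(1080, 480))  # 480 -- an "HDTV-compatible" set
    print(effective_lines(1080, 720))  # 720 -- a true HD display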

Now that you understand the mysteries of digital formats, you're ready to choose among plasma, LCD, projection, and CRT displays. But that has to wait until next week.

By Stephen H. Wildstrom

