There’s been a misconception in television for a while now and I’d like to clean it up. I’ll start with: some of you may have noticed the preposition in the previous sentence is “in” not “on.” “On” television implies a program of some sort. The problem is with the television itself. You most likely have one of these televisions and you’re calling it by the wrong name.
It started many moons ago with the concept of the pixel. (Yes, there are going to be a lot of numbers in this particular posting but, hey, it’s not another baseball post.) Television has always been about pixels, at least as far as quality is concerned. A pixel is defined as “the smallest element of an image that can be individually processed in a video display system.” You’ve heard about them for years, especially when computers started coming home. VGA and SVGA video cards. Then High-Definition, or HDTV. Now we’ve got 4K and 8K wandering into our homes. But 4K is somehow twice the quality of HD, when full HD has been defined for years as 1080. And 4000 is roughly 4 times 1080, not twice. It’s also not really 4000 (4K) but 3840, rounded up.
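A quick bit of arithmetic shows where that “twice” actually comes from. This is just a sketch using the standard pixel dimensions for full HD and consumer 4K (the 2160 figure isn’t mentioned in this post, but it’s the standard 4K height):

```python
# Standard pixel dimensions for full HD ("1080p") and consumer 4K.
full_hd = (1920, 1080)
uhd_4k = (3840, 2160)   # "4K" is really only 3840 wide

width_ratio = uhd_4k[0] / full_hd[0]                               # how much wider
pixel_ratio = (uhd_4k[0] * uhd_4k[1]) / (full_hd[0] * full_hd[1])  # total pixels

print(width_ratio)  # 2.0 -- twice as wide
print(pixel_ratio)  # 4.0 -- four times the pixels
```

So 4K is “twice” HD only if you compare widths, which is exactly the point this post is building toward.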
So, here’s the deal in a nutshell. For some reason that I’ve never been able to figure out, television quality has been defined as the number of pixels tall an image is. The vertical measurement. Back in the day, a whopping 20 years ago, the pinnacle of television was DVD and a 480p picture. That’s 480 pixels tall. And it was wondrous. Then came high definition television, HDTV, and it had two qualities – either 720p or 1080p. 1080 was, of course, the better of the two. But it had a problem.
The measurement was still vertical, though, and not always consistent. 1080p only really works as a measurement if the picture uses a 16:9 aspect ratio (the shape of the picture, width to height). Broadcast television uses that, so broadcast picture was usually just fine. Movies, however, are a different story. Some don’t use a 16:9 aspect ratio. At 16:9, a full HD picture is 1920 pixels wide and 1080 pixels tall (there’s your 1080). The problem comes in when you have a movie that isn’t 16:9.
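If you want to convince yourself that 1920 by 1080 really is 16:9, here’s a tiny sketch that reduces those dimensions to their simplest ratio:

```python
from math import gcd

# Reduce full HD's pixel dimensions to their simplest whole-number ratio.
width, height = 1920, 1080
divisor = gcd(width, height)  # greatest common divisor: 120

print(f"{width // divisor}:{height // divisor}")  # 16:9
```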
I’ll try to simplify. Let’s look at the 1988 Christmas classic movie Die Hard. (That’s right! It’s a Christmas movie!) Die Hard was shot anamorphically on 35mm film and looks very sharp. But it wasn’t 16:9; it was roughly 2.35:1. What does this mean? When you watch it on your HDTV, it’s going to have those black bars at the top and bottom so the whole width of the picture can be shown. And THAT is why we’re doing it wrong.
HDTV should not be measured in vertical pixels, but in horizontal ones. You don’t have a 1080p television. You should have a 1920p television. That’s the width (and it’s how they can call 4K twice as much, since 3840 is twice 1920). Die Hard is not 1920 by 1080. It fills the full 1920 width, but at its wide aspect ratio the height is only 817 pixels, not 1080. If you’re watching it right now (while reading this at the same time somehow) you’re not watching a 1080p version. You’re watching an 817p version.
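That 817 figure is just the screen width divided by the film’s aspect ratio. A minimal sketch (the function name is mine, and 2.35:1 is the assumed ratio for the movie):

```python
def picture_height(screen_width, film_aspect_ratio):
    """Visible picture height when a widescreen film fills the screen's
    full width, leaving black letterbox bars above and below."""
    return round(screen_width / film_aspect_ratio)

print(picture_height(1920, 2.35))    # 817  -- Die Hard on a "1080p" set
print(picture_height(1920, 16 / 9))  # 1080 -- a 16:9 source fills the screen
```

The width stays constant no matter what you watch; only the height changes, which is the whole argument for naming the set by its width.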
Now that 4K televisions are becoming less expensive and more prevalent in homes, we’re starting to see that whole errant numbering thing fixed. But, in case you didn’t know where those numbers came from, now you do. They come from the width. So if, on the off chance, someone asks if you’ve got a 1080p HDTV, you should tell them, “Nope. I’ve got a 1920p HDTV.” Then watch their mind slowly melt and ooze out their ears. It’s a lot of fun.
Now the misconception has been cleared up and you can resume watching your regularly scheduled program. (As if there is such a thing anymore between streaming and DVRs!)