Why 1080p doesn’t matter. Get over it.
I’ve got to calm down a little bit before I really start pounding out this article. The fact that such glaring technological fallacies still persist, even amongst intelligent people, is just too much for me to handle without wanting to commit all sorts of homicide on a mass scale. Unfortunately, I’ve come to realize that a lot of what people understand about the upper echelons of video resolution and recording quality is the product of clever marketing campaigns. I want to tackle my grievances with the general public one at a time so I don’t devolve into a totally aimless rant.
I know this is going to be a tough one to overcome, because many of you have invested in 1080p HDTVs and want to believe so badly that they’re better because they’re 1080. Well, you’re half wrong, and I’ll explain. There are a lot of factors that affect video quality that are entirely removed from resolution. First, color quality has a huge effect on our perceived quality of a television, and not just brightness but accuracy. I don’t care what resolution your television is: if I watch hockey and the shadows on the ice look blue instead of grey, then your television is crap or you don’t know how to tune it. Second, motion quality is key to perceived accuracy as well. I’m not talking about any of this 120/240Hz crap either (which I’ll save for another article). I’m talking about actual panel response time (grey-to-grey) in milliseconds. This number got lost amongst all the bullshit about contrast ratios and expanded color ranges. Educate yourself.
These things I’ve mentioned matter to your picture quality, and resolution does as well. Most people just don’t have any understanding of the point at which it becomes a factor. This is easy to demonstrate with stuff at your house. Pull out a shirt and look at the threads that actually constitute the fabric. Now back away until you can’t see the threads anymore. Would it make any difference to your visual perception of the fabric if, at this distance, more threads were added? The answer is a resounding no. Take a look at this chart.
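The thread analogy maps directly onto pixels, and you can sketch the arithmetic yourself. Assuming the common rule of thumb that 20/20 vision resolves about one arcminute, here’s a rough back-of-the-envelope calculation (the 50-inch screen size is just an example) of how far back you have to stand before individual pixels vanish:

```python
import math

def pixel_vanish_distance_ft(diagonal_in, horiz_pixels, aspect=(16, 9)):
    """Distance in feet beyond which a 20/20 eye (~1 arcminute of
    resolving power) can no longer separate adjacent pixels."""
    w, h = aspect
    screen_width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch_in = screen_width_in / horiz_pixels
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin_rad) / 12

# Example 50-inch set, 1080p vs. 720p horizontal resolution
print(round(pixel_vanish_distance_ft(50, 1920), 1))  # ~6.5 ft
print(round(pixel_vanish_distance_ft(50, 1280), 1))  # ~9.8 ft
```

In other words, past roughly ten feet on a 50-inch screen, 720p and 1080p deliver the same picture to your eye; the extra pixels are threads you backed away from.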
To help you grasp this concept, consider the 2011 Volkswagen GTI, of which I am the proud owner. In order to get a sunroof, I had to upgrade my radio from a $100 basic CD player to a $500 touchscreen job I was going to rip out anyway. Point is, if I wanted the higher quality product (a GTI with a sunroof), I had to buy some shit that didn’t matter to me or the end result of my car. Same for televisions. The price of admission to higher-end video quality is having to pay for a bunch of shit they claim makes a difference, when it’s really the underlying processes and factors that either can’t be or won’t be explained numerically. I don’t blame corporations like Sony, LG, or Samsung (even though Samsung started all this bullshit); I blame the consumers for not voting properly with their consumer dollars. Don’t buy a Samsung surround sound system just because your TV is a Samsung. It’s like picking Michael Jordan for your baseball team. Just stupid. Hopefully you will walk away from this article with a little more knowledge and skepticism about the world of consumerism. Don’t rely on the people selling you products to give you their opinion. Get yourself educated just enough to form your own. At this point I’ll be taking questions, queries, and harassing statements from people who don’t understand the things I have explained.
It’s true, 720p is just as good as 1080p to most people, and we’re about to embark into the 4K world.
What really matters is getting rid of 4×3 aspect ratios. The 16×9 aspect ratio provided by 720p, 1080i, and 1080p gets us closer to viewing movies as they appeared on the big screen, with much less of the pan-and-scan editing that destroys the cinematic experience. It gets us closer, but yes, there is another that is even better: 21×9. Not many TVs out there support it yet, but to get the CinemaScope experience, we’ll have to upgrade yet again.
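The pan-and-scan complaint comes down to simple arithmetic: any content wider than the screen either gets cropped or letterboxed. A rough sketch, taking CinemaScope as 2.39:1 and using example panel resolutions, shows how much of each screen the film actually uses:

```python
def letterbox(screen_w, screen_h, content_ratio):
    """Fit content of a given aspect ratio on a screen without cropping.
    Returns (black bar height in pixels, % of screen height used)."""
    content_h = screen_w / content_ratio
    if content_h > screen_h:        # content narrower than screen:
        return 0, 100.0             # it would pillarbox, not letterbox
    bar = (screen_h - content_h) / 2
    return round(bar), round(100 * content_h / screen_h, 1)

# A 2.39:1 scope film on a 16:9 1080p panel vs. a 21:9 panel
print(letterbox(1920, 1080, 2.39))   # (138, 74.4): a quarter of the panel is black bars
print(letterbox(2560, 1080, 2.39))   # (4, 99.2): nearly edge to edge
```

That wasted quarter of the screen is exactly the upgrade pressure the comment describes: 21×9 panels give scope films nearly the whole display.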
Last year when I was at NAB, nobody was optimistic about 3D, verifying my belief that it was to be a short-lived fad, a la AM Stereo (and for the most part, HD Radio).
You are spot on regarding the controversy and misinformation/propaganda surrounding refresh rates. A 120Hz refresh rate is noticeably better than 60Hz, but 240Hz isn’t that much better than 120Hz on most TVs. I suspect there’s a lot of spec fudging, just as there was and is in the computer monitor business. Gaming drives that business, and true response times of 5 or 6ms greatly enhance gaming over monitors that ‘only’ provide 11-15ms. However, most computer monitors have a 60Hz refresh rate, which matches the power cycle in North America (in Europe, it’s 50Hz). 240Hz, 480Hz, or even 960Hz are made-up specs, like 500,000:1 contrast ratios and 24-bit color. True contrast ratios are often measured in the thousands, and the true color gamut of even the most expensive LED LCD TVs is often far less than advertised, although unnoticeable to the vast majority of consumers.
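The arithmetic behind this comment is easy to check: the time available per frame shrinks as the refresh rate climbs, and once the frame period drops below the panel’s grey-to-grey response time, the extra Hz can’t display a fully settled frame anyway. A quick sketch (the 5 ms response time is just an example figure):

```python
def frame_period_ms(refresh_hz):
    """Time in milliseconds available to display one frame."""
    return 1000 / refresh_hz

response_ms = 5  # example grey-to-grey panel response time
for hz in (60, 120, 240, 480):
    period = frame_period_ms(hz)
    # Past ~200Hz the frame period is shorter than the pixel transition
    print(f"{hz:3d} Hz -> {period:5.2f} ms per frame, "
          f"panel settles in time: {period >= response_ms}")
```

At 60Hz a frame lasts about 16.7 ms and at 120Hz about 8.3 ms, comfortably longer than a 5 ms transition; at 240Hz the frame period (~4.2 ms) is already shorter than the response time, which is one reason the higher numbers buy so little.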
As for myself, the next TV I buy will be at least 3D-capable, not for the 3D ability itself, which I have no use for, but because those sets often have better screens to begin with, like local dimming. LED LCDs have better contrast ratios because of their ability to control light better, and they produce blacker blacks, much like plasma sets.
The best thing to do when shopping for a new TV is to take a favorite DVD, Blu-ray, or ripped content on a USB stick and play it on the set you’re looking at. Sit or stand back at the same distance you normally view your existing TV from, and see how good it looks. And don’t forget to browse places like Pacific Sales and your local TV and appliance centers for floor-model and discontinued-item discounts, which can be substantial, sometimes 50-75% off a previous top-of-the-line model.
I’ve been telling people this for ages…great to know that I’m not in this fight alone!
Heck, my vision is so bad, I have trouble telling 480 and 1080 apart :p