MPEG - A Blight Against Video Quality
Watching music videos tonight, I realized something: MPEG video encoding is a crime against quality.
Back when bandwidth was scarce and expensive, MPEG made near-full-speed digital video possible. Without severe compression, we wouldn't have had digital cable until years later. But this amazing technology comes with a serious cost.
MPEG compression throws away most of the data, and sacrifices some video quality to do it. The higher the compression rate, the less data you end up with (less to transmit or store), but the worse the quality gets. To my knowledge, there is no MPEG compression that doesn't lose at least some quality. "Some" quality loss that your eye doesn't really notice is fine. But MPEG has some serious drawbacks: it screws up when the entire screen is changing a lot, quickly - like during explosions in an action movie.
I'm not going to explain key frames and all that - there are plenty of websites where you can learn how MPEG transmits a full frame of video only once in a while, sending "what has changed on the screen since the last frame" the rest of the time.
I'm just going to say that MPEG is so wrong for many reasons.
In the old days, with analog TV, you could flip through channels very quickly. A high-quality VCR could "tune" from one channel to the next in about 1/10th of a second. This was awesome - you could flip through channels and tell very quickly whether you wanted to watch something or not. Experienced TV viewers could tell in an instant what was an ad and what was a show; it's a finely-honed skill. But with MPEG compression, this is no longer possible! MPEG only transmits a key frame about once a second, or even less often. So, when you tune into a new channel, you've jumped right into the middle of "this changed, that changed, and on the next frame, this changed, that changed"; your tuner has no idea what any of it means until the next key frame goes by! Visually, you see: nothing. A black screen. For up to a second, sometimes longer. Then the picture appears. It's really irritating!
What I'm saying is, no matter how fast your computer/HDTV/Bluray/Tivo/Roku/Google TV/Digital TV Tuner/cell phone gets, it will NEVER be faster at tuning. Never. Not with MPEG, anyway. It's not a "hardware is slow" issue - it is a design flaw in the MPEG compression itself.
Suppose you turned on CNN Headline News, and the title along the bottom says "Dow Jones average", and the news anchor is saying "it went up 5 points just now... now it went down 3 points... now it went up 13 points... now it went down 6 points..." And you wonder, what is the Dow right now? Nothing on the screen tells you the actual number. You're getting second-by-second updates about the number, but you don't know what the number actually is. You don't know, because you didn't hear what the price was when they started reciting the changes; plus, you missed some of the changes they said earlier. Now imagine that once every hour they actually tell you the exact price - "it's 11524 right now" - then continue reciting all of its ups and downs. Since you heard the last real price, you have a chance of tracking the number during the next hour, by adding and subtracting the intermediate changes from that top-of-the-hour number of 11524.
That's sort of how MPEG works. If you miss the top-of-the-hour number, you gotta wait an hour to hear it again; you can't tell anything useful until then. For MPEG it's about once a second, which is a long time in video-land. It's irritating when you're tuning channels.
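To make the analogy concrete, here's a toy sketch in Python of "tune in mid-stream, wait for the next key frame." All the names and numbers (30 fps, one key frame per second) are my own illustrative assumptions - a real MPEG stream has I/P/B frames, motion vectors, and so on - but the waiting behavior is the same:

```python
# Toy model: a stream is a list of ("key" | "delta", index) frames.
# A viewer tuning in mid-stream can't display anything until a keyframe arrives.

def decode_after_tuning(stream, tune_in_index):
    """Walk the stream from the moment we tune in; return the index of the
    first displayable frame (the next keyframe), or None if none arrives."""
    for i in range(tune_in_index, len(stream)):
        kind, _ = stream[i]
        if kind == "key":
            return i          # full picture arrives: display starts here
        # "delta" frames are useless without a reference frame, so skip them
    return None               # no keyframe yet -- the screen stays black

# 3 seconds of 30 fps video with a keyframe once per second (every 30th frame)
stream = [("key" if i % 30 == 0 else "delta", i) for i in range(90)]

first = decode_after_tuning(stream, tune_in_index=31)  # tuned just after a keyframe
print(first)                   # 60 -- the unlucky viewer waits ~0.97 s of black screen
```

Tune in one frame after a key frame and you wait nearly the full interval; tune in one frame before one and you get lucky. On average you wait half the key frame interval, but it's the worst case you notice while channel-surfing.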
And nobody complains about it. They just live with it. This pisses me off! There are technological things that could be changed to fix this problem. One fix would be to change the MPEG compression to transmit Key frames more frequently. Yes, this would take up more bandwidth per channel - and it's WORTH IT for quality improvement - but if nobody complains, it won't be done. Why should media companies change their ways, if people put up with what they have today?
Another possible trick would be to use multiple tuners, 2 or 3. When you're on channel 15, say, the box has already tuned channels 16 and 17, and is waiting for their key frames to go by. If you suddenly change to channel 16, the channel 16 tuner actually saw a key frame go by at some point during the last second - so it can show you the video image INSTANTLY. The old channel 15 tuner can now become the channel 18 tuner (so your box is tuned to channels 16, 17, and 18, and you are viewing 16). This solves the problem so long as you don't try to flip channels faster than 3 per second. Not great, but somewhat better than today. But what if the person changes channels down, instead of up? Maybe you need 1-2 extra tuners in that direction, too. 5+ tuners? That's a lot of extra hardware, and extra cost, to work around a software problem! And what if the person jumps to a channel that's not in sequence? You didn't have that channel tuned, so we're back to the full second-plus delay again. Maybe you need a tuner on EVERY channel? That's not a practical thing for hardware to do. Frustrating.
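The multi-tuner idea above can be sketched in a few lines. This is a hypothetical design, not any real set-top box's API; the one-second key frame interval and the "lookahead" channel scheme are assumptions carried over from the discussion:

```python
# Sketch of the multi-tuner workaround: spare tuners sit on the next channels
# up, buffering a keyframe, so a channel-up switch can display instantly.

KEY_INTERVAL = 1.0   # assumed seconds between keyframes

class TunerBank:
    def __init__(self, channel, lookahead=2):
        self.lookahead = lookahead
        self.tune(channel)

    def tune(self, channel):
        self.current = channel
        # Pre-tune the next `lookahead` channels; within one keyframe interval
        # each of them will have a displayable keyframe buffered.
        self.ready = {channel + i for i in range(self.lookahead + 1)}

    def switch(self, channel):
        """Return the viewer-visible delay, in seconds, for this change."""
        delay = 0.0 if channel in self.ready else KEY_INTERVAL
        self.tune(channel)
        return delay

bank = TunerBank(15)
print(bank.switch(16))   # 0.0 -- pre-tuned, shows instantly
print(bank.switch(17))   # 0.0 -- the bank re-shuffled after the last switch
print(bank.switch(42))   # 1.0 -- random jump: back to waiting for a keyframe
```

The sketch makes the weakness obvious: sequential flipping is instant, but any out-of-sequence jump (or flipping downward, which this version doesn't cover at all) pays the full key frame delay again.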
The key-frames-not-sent-often-enough problem hurts you, the viewer, in other ways, too. The most action-packed scenes in a movie are often RUINED by MPEG encoding. When everything on the whole screen is flashing and changing quickly - an explosion with parts flying everywhere - there's no way MPEG can handle it. It has to transmit nearly a full key frame's worth of data for every frame, which is too much, so the number of frames per second drops dramatically - sometimes to 2 frames per second! The most important part of an action movie is now ruined, reduced to a slow-moving slideshow of fireballs - killing the height of the excitement, and reminding you of Uncle Jethro's never-ending slideshow of last year's family wiener roast. Just think: all the trouble and expense Hollywood put into making this one-shot-only scene, and they can't even record it on DVD properly for their customers to see. Incredible.
Sometimes the tree outside my house partially blocks the digital TV dish in my back yard, so some of the digital data gets lost. If the lost data happened to be some of the change-frames, I only notice a little blurriness, and it goes away quickly. But if a key frame got damaged or lost, look out! The entire screen freaks out with wild colors and crazy blockiness for at least 1-2 seconds! Lots of green, usually, which is strange. Anyway, it looks horrible, as if the person on the screen were stuck in mud and it's sticking to them as they move around; then, just as suddenly, it's all clear and working again. The 1-2 seconds of weirdness happened because the change-frames were describing changes, but my TV had the wrong key frame data to apply them to. The screen cleared up when the following key frame came through undamaged. I always know it's time to trim the tree when this happens.
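That error-propagation behavior is easy to model. In this toy sketch (my own simplification - each "frame" is just a number, a key frame carries the value and a delta carries the change), losing one key frame corrupts every frame after it until the next key frame arrives:

```python
# Toy model of keyframe loss: deltas get applied to stale data, and the
# garbage persists until the next intact keyframe resets the picture.

def play(frames, lost_keyframe_index=None):
    """Return the value 'displayed' at each frame, optionally dropping one keyframe."""
    shown, value = [], 0
    for i, (kind, payload) in enumerate(frames):
        if kind == "key":
            if i != lost_keyframe_index:   # a lost keyframe never updates `value`
                value = payload
        else:
            value += payload               # delta applied to whatever we have
        shown.append(value)
    return shown

frames = [("key", 100), ("delta", 1), ("delta", 1),
          ("key", 200), ("delta", 1), ("delta", 1),
          ("key", 300), ("delta", 1)]

print(play(frames))                         # [100, 101, 102, 200, 201, 202, 300, 301]
print(play(frames, lost_keyframe_index=3))  # [100, 101, 102, 102, 103, 104, 300, 301]
```

Note the second run: frames 3-5 show nonsense (deltas stacked on the stale value 102), and everything snaps back to normal at the key frame in position 6 - exactly the "1-2 seconds of weirdness, then suddenly clear" symptom described above.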
The MPEG standard was designed and chosen back when dialup Internet was standard, and digital video was just a tiny rectangle on a computer screen. Today, all this has changed. If you "only" have 1Mbps Internet, that's considered slow. Half the web pages now have video advertisements playing alongside your normal content. And a wide variety of streaming movies and clips play easily from sites like YouTube and Netflix. We have more bandwidth now. Our computers are 8X faster now, and have terabyte hard drives for local storage. I think we can handle updating the way we use MPEG to spend a little more data, and improve quality dramatically at the same time.
Boy it's a good thing people don't complain enough, or somebody might actually have to fix this blight. Sometimes I wish we could return to the speed and simplicity of the "good old days" of analog television.
tl;dr: MPEG sux. TV channels should use more bandwidth to give us QUALITY video for a change. TV's moving to the Internet anyway; everything is. Just do it already.