You probably own a TV, right? Then I am pretty sure that the picture you are getting is pretty shitty. Why? Because nowadays it seems to require a degree in the mixed discipline of setting up a TV picture the way it should be, and 99% of people don’t even know what RGB means. And if you belong to that 99%, that’s totally fine, because you really shouldn’t have to care. Instead, the people creating those technologies have — excuse me — fucked up, and you deserve better. No matter if you spent a few hundred bucks or a few thousand on your TV set.
This is kind of a nerdy rant, but I am going to explain the technicalities so that everyone should be able to understand them… and also to give some advice on how to get a decent picture out of your TV.
The Basics: Color
How does a TV work? Basically, it throws light of different colors into your eyes. Do you know how colors work? From elementary school, you probably remember the subtractive color system (not that you knew it was called that). For example, if you added some blue to yellow, you ended up with some green. If you mixed all the colors, you ended up with some brownish crap and your teacher would ask what you were doing.
Colors created by light work a bit differently, those are called additive colors. If you mix everything together you get white. If you don’t have light at all it’s black.
There are a few special colors from which you can create basically any other color by mixing. For the subtractive system, those are Cyan, Magenta, and Yellow (you know those from your printer, right?). For the system of your TV, those are Red, Green, and Blue. Using just these three, you can mix whatever you want.
The Basics: Resolution
A modern TV works by dividing the picture into very tiny boxes called pixels. The number of pixels is called the resolution. You probably know the term “Full HD”? That means the picture is divided into 1920 pixels horizontally and 1080 pixels vertically, each potentially with a different color. The latest shit is called 4K, which means you will have — no, not 4000, that would be too easy, right — 3840 pixels horizontally and 2160 pixels vertically.
Why does it matter? Because the more pixels you have, the sharper the image will appear. But it also matters over how large an area those pixels are spread. On a smartphone with only a few inches of screen space, each pixel — even at “only” Full HD — will be pretty tiny and thus the image pretty sharp. On a giant 80-inch TV, those pixels will be much bigger and therefore more noticeable.
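To put numbers on that: pixel density is just the pixel count along the diagonal divided by the physical diagonal. A quick sketch in Python (the 5.5-inch phone is a made-up example size):

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Pixel density: pixels along the screen diagonal divided by its length in inches."""
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(width^2 + height^2)
    return diagonal_px / diagonal_inches

# The same Full HD resolution on a hypothetical 5.5-inch phone vs an 80-inch TV
phone_ppi = pixels_per_inch(1920, 1080, 5.5)  # roughly 400 pixels per inch
tv_ppi = pixels_per_inch(1920, 1080, 80)      # roughly 28 pixels per inch
```

Same resolution, roughly 14 times fewer pixels per inch on the big screen — which is why Full HD looks razor-sharp on a phone and noticeably coarser on a huge TV.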
The Basics: Refresh Rate
Up until now, everything I explained is true for both still images and video. But when it comes to video, refresh rate comes into play. In order to make you think that stuff moves in a picture, a TV will just throw a bunch of different still images at you at a specific rate. Your laptop screen probably has a refresh rate of 60 Hz (Hertz), which means it will throw 60 images per second at you. That is great for stuff like playing fast-paced video games or moving your mouse cursor at high speeds, but movies, for example, are recorded at only 24 fps (frames per second, which in this case is the same as Hz).
Combining things makes stuff hard
Easy until now? Okay, now let’s combine a few things. To watch something we need the following ingredients:
- A device with a screen — your TV
- A playback device — for example a Blu-ray Player, an Apple TV, an Amazon Fire TV, a computer or laptop or something else entirely
- A piece of media we want to watch — for example a Blu-ray disk or a movie off of iTunes, Netflix or Amazon Prime but could also be a game on a PS4 or Xbox
The media basically should dictate the whole rest of your experience. Why is that? Because it comes with a few assumptions that are hard to change correctly (as in: so that you don’t get a crappy picture) later in your viewing chain. However, usually your screen device dictates most of that stuff, and the playback device somehow has to marry the media’s assumptions to the screen’s — and that will most definitely go wrong.
Why will it go wrong? Because there are gazillions of settings on each device and (and this is what I would actually ask vendors to implement) there is no “automatic” or “just do the right/best thing” switch.
Fuck-up 1: Resolution
Resolution is actually the smallest fuck-up in my opinion, because nowadays media has such a high resolution that it won’t matter much if you do a crappy job of mapping a Full HD image to a 4K image.
Still, vendors could do a much better job of this using all kinds of sciency tricks. Let’s take an easy scenario like the one above. You have an image of 1920x1080 pixels and need to draw it on 3840x2160 pixels. Easy. You just draw every pixel from the source image twice in each direction and … tada. This is called pixel doubling and is actually considered a very bad way of scaling up an image. I will not go into the details, but using knowledge about edges and contrast distribution in your image, you can do a much better job with other techniques.
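To make pixel doubling concrete, here is a toy sketch — not how any real TV implements it, and real scalers use smarter filters (bilinear, bicubic, Lanczos) instead:

```python
def pixel_double(image):
    """Naive 2x upscale: repeat every pixel twice horizontally and vertically.
    `image` is a list of rows; each row is a list of pixel values."""
    doubled = []
    for row in image:
        wide_row = [px for px in row for _ in (0, 1)]  # duplicate each pixel in the row
        doubled.append(wide_row)                       # ...then duplicate the whole row
        doubled.append(list(wide_row))
    return doubled

# Every source pixel turns into a hard-edged 2x2 block — no new detail is invented
src = [[1, 2],
       [3, 4]]
assert pixel_double(src) == [[1, 1, 2, 2],
                             [1, 1, 2, 2],
                             [3, 3, 4, 4],
                             [3, 3, 4, 4]]
```

Because pixels are only duplicated, edges stay blocky; interpolating scalers blend neighboring pixels instead, which is why they look smoother.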
Still, at least you will see the full image on your device. Unless … you have a setting for over- or underscan. Those are usually set up correctly, but sometimes they are not. For example, the Nintendo Switch had overscan enabled by default. What is this? It comes from the times when we still had those bulky TVs that did not use exact pixels but an electron beam that projected the image onto a phosphor screen. Those would normally not display the edges of the picture exactly right, so the image would be stretched a little to make sure there were no black bars around the screen.
Why do we still have those settings? I don’t know, but maybe you want to watch something on a really old TV — then this makes sense. Otherwise, make sure to deactivate this setting on your playback device and check that your TV matches it.
Fuck-up 2: Refresh Rate
Let’s start with why that is already a major fuck-up: Because somehow you need to tell your screen to set the refresh rate to 25 or 23.976 (or at least 24) Hz when watching. That does not happen automatically in 99% of cases! If you are watching on your laptop, most likely your screen will still be at 60 Hz.
Why is that a problem? Because your laptop now has the problem of showing 25 still images per second using 60 refreshes. That doesn’t sound hard, but let’s start with the first image of those 25. You show it for 1/60 of a second. Great. Now for the second 1/60 of a second, what do you show? Well, you should still show the first image, because 60 divided by 25 is 2.4. That means every frame of those 25 needs to be shown for 2.4 refreshes. Okay, you have shown frame 1 of 25 again. Great. But now the third refresh. Technically, the first 40% of it should still be the first of the 25 frames and the remaining 60% the second. But you can’t do that — you have to draw a single frame. There are several options here, and some smart applications use them by mixing the first and second frame somehow, but in the end, you will most likely notice a slight “stutter” or some blurriness because the frames are mixed into each other.
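You can see that mismatch in a small simulation. This toy model (not any vendor’s actual algorithm) just picks, for each of the 60 refreshes, the newest source frame available — the simplest possible pulldown, no blending:

```python
from collections import Counter

def pulldown_schedule(source_fps, display_hz):
    """For each display refresh within one second, the index of the source frame
    shown (simplest 'repeat the current frame' strategy, no blending)."""
    return [int(i * source_fps / display_hz) for i in range(display_hz)]

schedule = pulldown_schedule(25, 60)
repeats = Counter(schedule)  # how many refreshes each source frame stays on screen
print([repeats[f] for f in range(6)])  # [3, 2, 3, 2, 2, 3] — uneven, hence judder
```

Every frame should be on screen for exactly 2.4 refreshes, but a refresh is indivisible, so some frames get 3 and some get 2 — that uneven cadence is the stutter you perceive.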
Okay, that was a bit hard to follow and you might think: Why is that guy talking about DVDs? That’s last century. The unfortunate thing is that with Blu-rays it’s still the same — not the regional thing (in most cases), but the framerate will in all likelihood not match your screen’s.
So, why don’t all screens just run at 24 or 25 Hz? Because that, again, is bad for fast-moving things like your mouse or an action game. And 60 fps is just not what movies are recorded in (also, it would probably look weird, because our eyes are used to movies looking a bit “stuttery”).
The only correct solution to this problem is to have a screen that automatically matches the framerate of the source material. If your playback device is a computer or laptop, there is a 99% chance this is not happening automatically. If you are using an app on a tablet, there is also a 99% chance that this is not happening (to my knowledge, the iPad Pro is a notable exception). If you are watching via an app directly on your TV, your chances are a bit better. If you are using a set-top box such as an Apple TV or Fire TV stick, you probably need to adjust the settings so that this works correctly. Way too much effort on your side!
Fuck-up 3: Color
Oh, color really is what got me into writing this. You would think: RGB like I said above, what’s the big deal? The problem is the following: PC-like devices such as a laptop, a tablet, or a gaming console indeed use RGB colors to produce their output, but movies come in a different so-called color space called YCbCr. You won’t believe it, but this technology dates back to black-and-white TV.
In RGB, every pixel lights up its components with different intensities of Red, Green, and Blue. So to create a color, you need three pieces of information: Red intensity, Green intensity, Blue intensity. Let’s assume those values can take any number between 0 and 100%. Now you can create any color. In YCbCr, you also have three pieces of information. The first is called luminance, and it tells you “if this picture were black-and-white, use this amount of grey”. The other two are the blue-difference and the red-difference. By applying some basic math, you can get about the same color as if you were giving RGB values.
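The “basic math” looks roughly like this. This sketch uses the classic BT.601 coefficients on full-range values between 0 and 1 — real video uses slightly different coefficients for HD and, as the next section shows, usually a limited range:

```python
def rgb_to_ycbcr(r, g, b):
    """RGB (each 0..1) -> YCbCr, BT.601 coefficients, full range."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance: the black-and-white grey amount
    cb = (b - y) / 1.772 + 0.5              # blue-difference, centered around 0.5
    cr = (r - y) / 1.402 + 0.5              # red-difference, centered around 0.5
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    """The inverse conversion back to RGB."""
    r = y + 1.402 * (cr - 0.5)
    b = y + 1.772 * (cb - 0.5)
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

# Converting to YCbCr and back recovers (essentially) the original RGB values
r, g, b = ycbcr_to_rgb(*rgb_to_ycbcr(0.8, 0.4, 0.2))
```

Note that for a pure grey (equal R, G, and B), both difference signals sit at the 0.5 midpoint — which is exactly why a black-and-white broadcast could simply omit them.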
Why is that a problem again? First of all, since your web browser is using RGB to render out something and your movie is using YCbCr there needs to be some kind of conversion again. Fortunately, this time it’s not as complicated as with refresh rate and you won’t lose anything you will be able to spot.
But the most horrid problem comes from the fact that someone, sometime, got the genius idea to define that an amount of 16/256 = 6.25% grey (Y, in YCbCr) should be black. Not 0%, as anyone else would have assumed — no, 6.25%. And not only with YCbCr, but also with RGB. That is usually called “limited” RGB, while the assumption that 0% is black is called “full” RGB.
… And that really is a problem, because your media, playback device, and TV each have their own interpretation of what black is. And if they don’t match, your picture will look like shit. No, really, even my I-really-don’t-care-that-you’re-a-nerd girlfriend can spot that immediately. There are basically three scenarios:
- Your media assumes 0% is black, your TV assumes 6.25% is black
This will give you what is referred to as “crushed blacks”. Your TV will just cut off all grey below 6.25% and display it as black. This means you will lose most shadow detail and, generally, your picture will look too dark.
- Your media assumes 6.25% is black, your TV assumes 0% is black
This will give you what is referred to as “a washed-out look”. Everything will look greyish and it will seem that there is not a lot of contrast in the image. Sometimes this can easily be spotted when the “black bars” at the top and bottom of the screen become grey. For online streams, you will also see a lot of encoding artifacts — areas in the image that appear to consist of larger boxes.
- Finally, the best-case scenario: your media and TV both assume the same, no matter if 6.25% or 0%
This will result in a “correct” display of colors and contrast.
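In 8-bit terms, “limited” means black sits at code 16 and white at 235, while “full” uses 0 and 255. Here is a small sketch (my simplification, not any device’s exact pipeline) of what a correct conversion does — and what the crushed-blacks mismatch does to your shadows:

```python
def limited_to_full(v):
    """Expand an 8-bit limited-range value (black = 16, white = 235) to
    full range (0..255), clipping anything outside the legal limited range."""
    return max(0, min(255, round((v - 16) * 255 / (235 - 16))))

# Correct handling: limited black lands on true black, limited white on full white
assert limited_to_full(16) == 0 and limited_to_full(235) == 255

# Mismatch: the display treats full-range media as limited and expands it anyway.
# Every shadow value below 16 gets clipped to black — "crushed blacks":
print([limited_to_full(v) for v in (0, 5, 10, 15)])  # [0, 0, 0, 0]

# The opposite mismatch: limited-range media shown without expansion. "Black"
# stays at code 16, about 6% grey — hence the washed-out grey letterbox bars.
washed_out_black = 16 / 255
```

Four genuinely different shadow tones collapse into a single black in the first mismatch, which is exactly the lost shadow detail described above.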
The funniest part (sad, actually — I am being sarcastic here) is that your TV, media, and playback device will in all likelihood not switch automatically, and there is a high chance that you are watching stuff with one of the bad combinations. Ah, no, the actual funniest thing is that your TV vendor and your playback device vendor will have different names for those two settings. For example, the Apple TV calls “limited RGB” “RGB (high)”. Probably “high” because black is not 0 but a higher number. Now, the LG OLED TVs have a setting called “Black Level” with the options “high” and “low”. However, a setting of “low” will assume black to be at 6.25%. See, exactly the opposite of the Apple TV!! How is anyone supposed to understand this!?
I don’t think I will write another article about this but there are a few other things that come to my mind that should be done correctly out of the box or automatically that are a bit more advanced. If you are now curious and want to spend the next few weekends going through the settings of all your devices: Color calibration comes to mind, the white point D65, gamma, motion smoothing, and then I could get started on HDR and Dolby Vision which are the newest standards that amazingly work with assumptions that no physical screen currently is able to reproduce.
I spent way too much time getting my “degree” in all this, and still I have to look up whether I need to set my Apple TV to RGB low or high. I really don’t advise you to do the same unless you want a flawless picture and have some time on your hands.
However, at least set your black level correctly. If you can, use YCbCr 4:4:4 and set your TV to “limited” or “high” or “16–235”. This is probably supported everywhere and is therefore the setting with the least worries.
I’d like to advise the industry involved in this to make all these settings much more consumer-friendly. No one wants to read articles like this one, and even less do they want to look up all this stuff on the internet.
It should just work.