Is there a point getting over 4K?
Some time ago I answered this question on Quora. Since it got some attention, I thought I'd re-post it as a blog post, with some improvements.
Enjoy the read!
To answer this question, it's important to start by thinking about where your media is going to be seen and used.
Video for online streaming
In this case, let's first check how many people have a screen that goes beyond 4K.
A w3schools.com survey reveals that nowadays (Jan 2018), only 2% of the screens used to browse the web have a 4K resolution, and 5.6% are a mix of other high resolutions (I suppose some higher than 4K and the others simply non-standard).
Another relevant thing to keep in mind when talking about streaming video is bandwidth. Imagine the download speed needed to stream an 8K video. In many places it's still difficult to stream 1080p, and YouTube has to offer a 'video quality' option to reduce the data streamed over slow connections.
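To get a feel for the numbers, here is a back-of-the-envelope sketch. The bit depth, frame rate, and especially the flat 1:200 compression ratio are illustrative assumptions (real codec efficiency varies widely per content and encoder), but the relative scaling between resolutions holds:

```python
# Rough estimate of the bitrate needed to stream common resolutions.
# Assumes 8 bits per channel, 3 channels (24 bpp), 30 fps, and a
# hypothetical 1:200 compression ratio -- illustration only.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

def stream_mbps(width, height, fps=30, bits_per_pixel=24, compression=200):
    """Return an approximate compressed bitrate in megabits per second."""
    raw_bits_per_second = width * height * bits_per_pixel * fps
    return raw_bits_per_second / compression / 1e6

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: ~{stream_mbps(w, h):.1f} Mbps")
```

Whatever compression ratio you plug in, 8K needs four times the bandwidth of 4K and sixteen times that of 1080p, which is exactly why slow connections struggle.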
Also, the fact that paid streaming services like Netflix, Amazon and VUDU don't offer more than 4K clearly suggests that there is still no point in going beyond that resolution for this use case.
Assets for video editing
In this case there is a real benefit to shooting at more than 4K. If you're working on a project that aims to render and export a 4K video, it's a good idea to have assets bigger than the final video. That gives the editors the flexibility to crop and zoom without lowering the final project's resolution.
No questions then, if it’s an asset, go big!
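As a quick illustration of that headroom (the capture widths below are just example values), a 6K or 8K source lets you punch in well past the frame before you'd have to upscale for a 4K deliverable:

```python
# How far can you zoom or crop before the source drops below the
# delivery resolution? Bigger capture = more editing headroom.

def max_zoom(source_width, delivery_width):
    """Maximum zoom-in factor before upscaling becomes necessary."""
    return source_width / delivery_width

DELIVERY_4K = 3840
for source in (3840, 5760, 7680):  # example 4K, "6K", 8K capture widths
    print(f"{source}px source -> zoom up to {max_zoom(source, DELIVERY_4K):.2f}x")
```

A 4K source gives you no margin at all (1.00x), while an 8K source lets you crop to a quarter of the frame and still deliver full 4K.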
Video for cinema (and other big screens)
Ah, the big screen! Well, as we know, some cinema screens already support 4K resolution (see IMAX — Wikipedia), and that's the highest resolution we've got so far in a cinema movie (with exceptions such as Guardians of the Galaxy Vol. 2, which was shot in 8K).
(for more see: High-resolution cinema: 4K, 8K and beyond)
Then why haven't cinemas gone bigger?
Well, it's surely not for lack of money in the cinema market. The answer is right in front of your eyes… actually, it IS your eyes!
The human eye resolution
The human eye is absolutely key to this discussion because, unfortunately, it has limits.
Although the theoretical limit of our eye's resolution is huge (we are talking about 324 'megapixels'; read: ClarkVision.com — the eye resolution), what really matters for answering our question is visual acuity.
As specified in Wikipedia:
Visual acuity is a measure of the spatial resolution of the visual processing system.
And in a BBC article we find:
At some resolutions, cameras record more details than we can discern.
So it doesn't really matter that our eyes are wonderful 324 Mp cameras: we still have a limit. That's why we can't see microscopic things like bacteria, and why we actually struggle with the small one-centimetre-tall letters that opticians ask us to read to test our sight.
There is also a distance at which resolution becomes either noticeable or optimal for the viewer, in relation to the screen size. These articles include handy charts to calculate it:
See Why Ultra HD 4K TVs are still stupid and 4K Resolution Does Matter — Here’s When
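The rule of thumb behind those distance charts is that an eye with 20/20 vision resolves about one arcminute: beyond the distance where a single pixel subtends less than that, extra resolution becomes invisible. A sketch of that calculation (assuming 16:9 screens and one-arcminute acuity; the screen sizes are just examples):

```python
import math

ARCMINUTE = math.radians(1 / 60)  # approximate 20/20 acuity limit

def max_useful_distance_m(diagonal_inches, horizontal_pixels):
    """Distance (metres) beyond which a 20/20 eye can no longer
    resolve individual pixels, assuming a 16:9 screen."""
    width_in = diagonal_inches * 16 / math.hypot(16, 9)
    pixel_in = width_in / horizontal_pixels            # pixel pitch
    distance_in = pixel_in / math.tan(ARCMINUTE)       # 1-arcminute pixel
    return distance_in * 0.0254                        # inches -> metres

for pixels, label in ((1920, "1080p"), (3840, "4K"), (7680, "8K")):
    d = max_useful_distance_m(55, pixels)
    print(f'55" {label}: pixels blend together beyond ~{d:.1f} m')
```

For a 55" screen this lands around 2.2 m for 1080p and about 1.1 m for 4K, which matches the common advice that a 4K TV only pays off if you sit unusually close, and helps explain why 8K needs either an enormous screen or an impractically short viewing distance.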
Conclusion
If your goal is to stream video online, 4K already represents a challenge for many internet connections.
If you’re making assets for videos, going over 4K can be a good idea, as it will give you flexibility in editing.
For other media like cinema and television, even though screens could (and will) reach ridiculously high definitions, our eyes will necessarily put a limit on the human visual experience. This doesn't mean we shouldn't have screens with resolutions higher than 4K, but rather that we should increase the size of the screen along with the number of pixels, to match the limits of the human eye.