This article attempts to answer two questions:
- How much resolution is lost in the compression stage?
- Does shooting at a higher resolution and then downsampling help mitigate this loss?
The answer to the first question should be fairly obvious to anyone who has compressed video. However, the second question can only be answered faithfully if we have an idea of the degree of loss involved in the first.
Understanding the limits of resolution loss will help us greatly when it is time to deliver the best quality work to our clients or customers.
In order to eliminate lens effects, a vector image (8-bit) was created in two versions:
- 1920 x 1080
- 3840 x 2160
Animation was applied to the test images to replicate resolution loss in motion. However, no motion blur was applied, because any loss caused by motion blur will scale proportionately. Remember, this test is about resolution, not sharpness. The software used was Adobe After Effects CC.
The first composition is pure 1080p. The second is 4K, downscaled to 1080p (nothing fancy, just using the Scale function in AE). Both compositions were rendered to the following formats for testing:
- Uncompressed TIFF (image sequence)
- ProRes HQ (QuickTime, QT)
- H.264 50 Mbps QT
- H.264 20 Mbps QT
- H.264 10 Mbps QT
- H.264 5 Mbps QT
The first is my favorite archival method. The second is the best intermediate codec (and nowadays an acquisition codec as well). The third is how DSLRs record data, while the last is how sites like YouTube and Vimeo deliver 1080p content. Don’t forget, these sites compress your already compressed video, which makes things a lot worse. Our goal is to find a formula so we can set limits.
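If you want to reproduce these encodes without After Effects, a rough equivalent can be scripted around ffmpeg. Treat this as a sketch only: the frame pattern, the 25 fps frame rate and the output names are my assumptions, and ffmpeg’s ProRes and x264 encoders stand in for the Adobe exporters actually used for the test.

```python
# Sketch: re-create the test encodes from an uncompressed TIFF sequence with ffmpeg.
# The frame pattern, frame rate and file names are placeholders, not the article's.
import subprocess

FRAME_PATTERN = "frames/test_%04d.tif"   # assumed TIFF sequence from the render
BITRATES_MBPS = [50, 20, 10, 5]          # the four H.264 data rates tested

# ProRes HQ intermediate (profile 3 = HQ in ffmpeg's prores_ks encoder)
subprocess.run([
    "ffmpeg", "-framerate", "25", "-i", FRAME_PATTERN,
    "-c:v", "prores_ks", "-profile:v", "3", "prores_hq.mov",
], check=True)

# H.264 versions at each target data rate (4:2:0 chroma subsampling)
for mbps in BITRATES_MBPS:
    subprocess.run([
        "ffmpeg", "-framerate", "25", "-i", FRAME_PATTERN,
        "-c:v", "libx264", "-b:v", f"{mbps}M", "-pix_fmt", "yuv420p",
        f"h264_{mbps}mbps.mov",
    ], check=True)
```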
Here’s what a sample frame looks like:
When producing the 4K version of the vector image, I had to take into account the increased resolution of a 4K sensor over the same area. This means that if a hair is one pixel wide in 1080p, it will be two pixels wide on the same sensor at 4K, everything else being constant. This is why 4K has more resolution, though whether that advantage survives downsampling remains to be seen.
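The actual chart was a vector image, but a raster approximation makes the scaling point concrete: the same feature drawn on a 1080p and a 4K canvas, with its pixel width multiplied by the linear factor 3840 / 1920 = 2. The positions, widths and Pillow drawing below are purely illustrative, not the chart used in the test.

```python
# Sketch: the same "1 px at 1080p" features drawn at both frame sizes with Pillow.
# Only illustrates that identical framing doubles a feature's pixel width at 4K.
from PIL import Image, ImageDraw

def draw_chart(width, height):
    scale = width / 1920                                 # linear scale vs. 1080p
    img = Image.new("RGB", (width, height), "white")
    d = ImageDraw.Draw(img)
    for base_px, x in [(1, 200), (4, 600), (16, 1000)]:  # the three test widths
        w = max(int(base_px * scale), 1)                 # 1 px at 1080p -> 2 px at 4K
        d.line([(x * scale, 100 * scale), (x * scale, height - 100 * scale)],
               fill="black", width=w)
    return img

draw_chart(1920, 1080).save("chart_1080p.png")
draw_chart(3840, 2160).save("chart_4k.png")
```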
Two frames from each video were exported as TIFFs. These were further compressed to JPEGs for the purposes of this article. However, the original TIFF frames are available for download here, should you wish to study them yourself.
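The frames for this article came straight out of After Effects, but something along these lines would pull comparable TIFF frames from each encode with ffmpeg (clip names and timestamps below are placeholders):

```python
# Sketch: grab single frames from each test encode as TIFFs using ffmpeg.
# Clip names and seek times are placeholders.
import subprocess

CLIPS = ["prores_hq.mov", "h264_50mbps.mov", "h264_20mbps.mov",
         "h264_10mbps.mov", "h264_5mbps.mov"]

for clip in CLIPS:
    for seconds in (1.0, 3.0):            # two frames per video, as in the test
        out = f"{clip.rsplit('.', 1)[0]}_{seconds:.0f}s.tif"
        subprocess.run([
            "ffmpeg", "-ss", str(seconds), "-i", clip,
            "-frames:v", "1", out,
        ], check=True)
```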
How much resolution do you lose with compression?
To arrive at a numerical percentage for resolution loss, I decided to use lines and circles of different pixel widths. It then becomes a matter of discovering how much each line or circle has ‘broken’ after compression – which comes down to counting pixels.
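The counting for this test was done by eye on the exported frames. If you wanted to script the same idea, a minimal sketch would be to sample one row across a nominally vertical test line and count how many pixels are no longer near-white; the file name, row and threshold below are placeholders.

```python
# Sketch: estimate how wide a test line has smeared after compression by
# counting dark pixels across it in an exported frame.
import numpy as np
from PIL import Image

def measured_width(path, row, x_start, x_end, threshold=200):
    """Count non-white pixels along one row, within a window around the line."""
    frame = np.array(Image.open(path).convert("L"))    # grayscale copy of the frame
    strip = frame[row, x_start:x_end]
    return int(np.count_nonzero(strip < threshold))    # pixels darker than near-white

# e.g. the nominal 1 px line, sampled on one row of the 50 Mbps export
print(measured_width("h264_50mbps_frame.tif", row=540, x_start=380, x_end=420))
```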
Here are the results:
| Format | Width (1 px)* | Width (4 px)* | Width (16 px)* | Diff. % |
| --- | --- | --- | --- | --- |
| H.264 50 Mbps | 4 | 8 | 32 | 88% |
| H.264 20 Mbps | 7 | 8 | 35 | 77% |
| H.264 10 Mbps | 10 | 10 | 35 | 67% |
| H.264 5 Mbps | 25** | 11 | 38 | 58% |
| H.264 50 Mbps | 2 | 4 | 17 | 100% |
| H.264 20 Mbps | 4 | 7 | 18 | 70% |
| H.264 10 Mbps | 4 | 12 | 20 | 57% |
| H.264 5 Mbps | 28** | 10 | 20 | 51% |
- *These are the original widths, even though the lines were drawn at 1 px, 4 px and 16 px. In the case of diagonal lines, it is better not to consider the absolute values, because the software interpolates them to fill in the gaps. For relative percentages, use both; for absolute percentages, use the vertical measure.
- **At this point large swathes of the line were missing. I’ve counted the largest gap rather than the width.
Now there’s a surprise. If you’re looking for fine detail, a 5 Mbps video loses about ten times the information. On average it loses at least twice the amount of resolution, and that’s a conservative estimate.
What does this mean? It means that if you’re shooting 1080p and putting it up on YouTube, what you’re watching is effectively standard-definition TV. In fact, for 1080p video, going by what we know about resolution and what we saw in the limits of upsampling, we can safely say that any data rate below 20 Mbps is unacceptable – if you’re a fan of image quality.
The results also go to show how important it is for an acquisition format to go beyond 50 Mbps, which is why that figure is mandated for broadcast quality. Here’s one reason why AVCHD sucks, with its 28 Mbps bit rate. Just in case it isn’t obvious, the results also include the effects of chroma subsampling, which is unavoidable with these codecs.
Okay, let’s move on to the second part of this test.
Results of downsampling 4K to 1080p – does it help?
The background image should not be used for comparisons in this second test. Here are the results:
| Format | Width (1 px) | Width (4 px) | Width (16 px) | Diff. % |
| --- | --- | --- | --- | --- |
| H.264 50 Mbps | 4 | 10 | 32 | 89% |
| H.264 20 Mbps | 8 | 10 | 32 | 79% |
| H.264 10 Mbps | 8 | 10 | 33 | 78% |
| H.264 5 Mbps | 19 | 12 | 34 | 66% |
| H.264 50 Mbps | 3 | 5 | 17 | 93% |
| H.264 20 Mbps | 4 | 5 | 18 | 87% |
| H.264 10 Mbps | 8 | 8 | 20 | 64% |
| H.264 5 Mbps | 20 | 8 | 20 | 60% |
It should come as no surprise that downsampling 4K to 1080p follows a similar pattern, with one notable exception:
When things are great (at the higher data rates), it hardly makes a difference whether you downsample or not. After all, resolution is resolution. However, the more you compress, the better downsampled 4K footage holds detail compared to native 1080p. By how much? Not much, really, but let me put it this way: if the limit for compression was 20 Mbps for 1080p, it is only 10 Mbps for downsampled 4K.
This means, for Internet video streaming at less than 10 Mbps, it is always better to shoot 4K and downsample to 1080p – no exceptions.
- If you’re shooting 1080p, never compress below 20 Mbps, or you’re effectively watching standard definition.
- For Internet video, shoot 4K and downsample always.
What do you think? Does this match your experience?