
Is there any benefit to encoding to 10-bit?

This question has been asked, and in theory answered, numerous times, with the topic usually popping up in relation to 8-bit x264 vs 10-bit x264. The generally accepted answer is that even with an 8-bit source there are benefits to encoding to 10-bit because, the argument goes, the higher-precision math produces smoother color gradients. This seems like a reasonable conclusion, and in fact I admit I am guilty of offering this advice in the past.
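For what it's worth, the gradient argument is easy to demonstrate in isolation. Here is a minimal numpy sketch (my own illustration, not anything from an encoder) that quantizes a shallow luminance ramp at both bit depths and counts how many distinct code values each one can spend on it:

```python
import numpy as np

# A shallow luminance ramp, e.g. a sky gradient spanning a small range.
ramp = np.linspace(0.20, 0.25, 1920)  # normalized 0..1 signal

# Quantize to 8-bit and 10-bit code values.
q8  = np.round(ramp * 255.0)   # 8-bit:  codes 0..255
q10 = np.round(ramp * 1023.0)  # 10-bit: codes 0..1023

# More distinct steps across the ramp means a finer gradient.
print("8-bit steps: ", len(np.unique(q8)))   # ~14 distinct levels
print("10-bit steps:", len(np.unique(q10)))  # ~52 distinct levels
```

Four times the code values across the same ramp is exactly the "smoother gradients" claim; the question is whether any of that survives to the screen.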

However, a recent discussion with a sparring partner on this forum led me to start rethinking this position, especially as I watched him tap dance around an offer he made, and I accepted, only to back out of it with silly objections.

One of the objections revolved around the source I proposed for his test: since some of the encoders to be tested support 10-bit encoding and some don't, he claimed that using a 4:2:2 10-bit source would unfairly penalize the 8-bit encoders.

Despite mocking him, I spent some time thinking about this, and no matter which way I look at it, I can't escape one very simple reality: it probably doesn't matter whether you encode to 8-bit or 10-bit, because in nearly all cases the final encode will be viewed on an 8-bit monitor, which means the 10-bit encode needs to somehow be mapped to an 8-bit display.
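To make that concrete, here is a rough sketch of the kind of 10-bit-to-8-bit mapping that has to happen somewhere in the playback chain. The function is hypothetical and doesn't reflect what any particular driver or renderer actually does, but it shows why four 10-bit codes collapse into one 8-bit code unless dithering trades the banding for noise:

```python
import numpy as np

def map_10bit_to_8bit(frame10, dither=False):
    """Map 10-bit code values (0..1023) to 8-bit (0..255).

    Plain rounding collapses four 10-bit codes into each 8-bit code;
    optional random dither breaks up the resulting bands. A sketch
    only, not what any real display pipeline necessarily does.
    """
    scaled = frame10 / 1023.0 * 255.0
    if dither:
        # Unshaped random dither applied before rounding.
        scaled = scaled + np.random.uniform(-0.5, 0.5, frame10.shape)
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

frame10 = np.arange(0, 1024, dtype=np.float64)   # every 10-bit code once
print(len(np.unique(map_10bit_to_8bit(frame10))))  # 256: a 4-to-1 collapse
```

So whatever extra precision the 10-bit encode carries, an 8-bit display reduces it to 256 levels again, minus whatever the renderer's dithering can salvage.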

In order to display a true 10-bit image, you need a video card, drivers and a monitor that all support true 10-bit output. But most monitors only support 8-bit, some support a pseudo 10-bit output (8-bit plus FRC), and as far as I know there are no consumer TVs that currently support true 10-bit output.

Further compounding my skepticism is that some, if not all, non-pro encoding front ends, such as HandBrake, still do their internal processing in 4:2:0 8-bit, despite being able to output 10-bit video (or 10/12-bit in the case of x265).

So I decided to perform a test using the source named BMPCC6K_Wedding.mov, from here:

https://www.blackmagicdesign.com/products/blackmagicpocketcinemacamera/workflow

The source is ProRes, 1034 Mb/s, 6144x3456, 24 fps, 4:2:2, and I used the latest build of HandBrake on Manjaro, with all updates applied.

When you load this in HandBrake, it automatically crops 448 pixels from the top and bottom, to 6144x2560, and I output it at a 1920x1080 storage resolution with a 2592x1080 (2.40:1) display resolution.
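A quick sanity check of that arithmetic, using the numbers above:

```python
# Source: 6144x3456; HandBrake crops 448 lines from top and bottom.
src_w, src_h = 6144, 3456
crop = 448
active_h = src_h - 2 * crop          # 3456 - 896 = 2560
aspect = src_w / active_h            # 6144 / 2560 = 2.40

# Stored at 1920x1080; the 2.40:1 display width follows from the aspect.
storage_w, storage_h = 1920, 1080
display_w = round(storage_h * aspect)  # 1080 * 2.4 = 2592
print(active_h, round(aspect, 2), f"{display_w}x{storage_h}")
# -> 2560 2.4 2592x1080
```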

I considered outputting it at 6K, but the reality is that most people don't even have a 4K monitor, much less a 6K one; at 1080p, it should give us a realistic idea of the benefit to an end user of encoding at bit depths higher than 8-bit.

As a side note, for $2500 this camera is one hell of a value: 13 stops, capable of recording to RAW or ProRes, full 6K resolution. If I were running a streaming site with unique content, I would author and stream in 5K as a way of differentiating my site from competitors. Fun fact: there is an "adult" site that sells its movies at resolutions up to 5K, though I strongly suspect it's 4K that's been upscaled.

What I did was encode the first one to x264 at CRF 22, then did 2-pass encodes at that bitrate for x264 10-bit and x265 8/10/12-bit. As you will note, in some cases the encoder missed the target bitrate slightly.
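For anyone who wants to reproduce the same workflow with ffmpeg instead of HandBrake, here is a rough sketch of the idea: make the CRF reference encode, read its bitrate back with ffprobe, then feed that bitrate to a 2-pass encode. The file names are placeholders, audio is dropped to keep the bitrate comparison clean, and the 10-bit step assumes your ffmpeg's libx264 was built with 10-bit support:

```python
import subprocess

def video_bitrate(path):
    """Read the overall bitrate (bits/s) of a file with ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=bit_rate",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True)
    return int(out.stdout.strip())

SRC = "BMPCC6K_Wedding.mov"  # source from the Blackmagic page

# Step 1: CRF 22 reference encode with 8-bit 4:2:0 x264.
subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
                "-crf", "22", "-pix_fmt", "yuv420p", "-an",
                "x264_crf22.mp4"], check=True)

# Step 2: 2-pass encode at the same bitrate, here 10-bit x264.
bitrate = str(video_bitrate("x264_crf22.mp4"))
common = ["-c:v", "libx264", "-pix_fmt", "yuv420p10le",
          "-b:v", bitrate, "-an"]
subprocess.run(["ffmpeg", "-y", "-i", SRC] + common +
               ["-pass", "1", "-f", "null", "/dev/null"], check=True)
subprocess.run(["ffmpeg", "-y", "-i", SRC] + common +
               ["-pass", "2", "x264_10bit_2pass.mp4"], check=True)
```

The same step 2 pattern works for the x265 variants by swapping in libx265 and the appropriate pixel format.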

Just for fun, I did a few higher-quality encodes as well, at 6K with x264 CRF 16, and I tried to match the bitrate with NVENC HEVC, but it fell short of the target by a bit. I will upload those shortly, as there seems to be a problem with the forum accepting those uploads at the moment.

If anyone has access to any commercial encoders (MainConcept, Sony AVC, Apple, Ateme) or a Turing card and wants to join in the fun, feel free to run your own test.
