I decided to ask /u/stevenrobertson (a YouTube engineer) why AV1 videos on YouTube use such high bitrates. Here is our exchange:
Me:
Hi, I’m a moderator of r/AV1, and the community has noticed a weird trend with AV1 videos on YouTube. The whole point of AV1 was to deliver better quality at smaller data rates, but apparently that’s not the case right now.
AV1 files have higher bitrates than VP9, and we are somewhat confused by this. Do you have any information you could share with the community that would help us understand how YouTube is planning to use AV1?
stevenrobertson:
Hey Dominic! AV1 is expected to save YouTube and its users a ton of bandwidth over the next decade or so. Our goal today is entirely around driving adoption, making sure hardware decoders land where they need to and are reliable and well tested. We’re explicitly targeting fidelity levels that are higher than VP9 for now as part of driving that adoption, making sure decoders are fast and reliable and making sure AV1 always looks great. This is still just the start of the AV1 rollout. HTH!
Huh. I’m not sure I follow how some of those things connect to each other. Is it significantly lighter and more stable to decode high-bitrate AV1 than low-bitrate AV1 at the same resolution? Or is he just saying they’re pushing quality as a move to provide a user-visible improvement, and the rest is just PR speak?
You have to remember this is a post from like 2-3 years ago. Back then the biggest issue was compatibility, and an AV1 video that used fewer features and was less efficient was easier to play. As for how it is today, the situation has vastly improved, but I am a bit rusty on this subject.
Ah, I was so happy to hear that; I thought they had raised the quality. Nowadays it seems AV1 is just much lower quality. For example, one video I just tested: AV1 549 MiB, best VP9 1009 MiB, so about 54% of the size.
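(For clarity, the 54% figure comes from dividing the two quoted file sizes; a quick sketch using the numbers from this comment:)

```python
# File sizes as quoted in the comment above (MiB).
av1_mib = 549
vp9_mib = 1009

# AV1 stream size relative to the best VP9 stream of the same video.
ratio = av1_mib / vp9_mib
print(f"AV1 stream is {ratio:.0%} the size of the VP9 stream")
# prints "AV1 stream is 54% the size of the VP9 stream"
```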
Unless AV1 is the only option at that resolution, fps, etc., VP9 usually has better quality on YouTube.
I don’t think this is the case anymore.