So I was wondering about Owncast’s bandwidth usage and started crunching some numbers. Then I happened to find that @[email protected] had already done the work. Article linked, but here’s one of the examples:
## Example Scenario
You’ve configured your broadcasting source (such as OBS) to stream to your Owncast instance at 5000kbps. You have 25 viewers: 5 of them are on slow or mobile networks, 17 have fast, stable internet, and 3 have fast internet most of the time, but the speed fluctuates. All 25 viewers watched an entire stream that lasted two hours. Your hosting provider gives you 4TB of bandwidth per month.
### Offer a high and low quality option
You decide to offer both a high and low quality option, and you set the high quality option to 5000kbps and the low quality option to 1500kbps.
#### How much bandwidth is used on your server for this stream?
| Bitrate | Duration | Viewers | Total |
|---|---|---|---|
| 0.000625 Gigabytes per second (5000kbps) | 7200 seconds | 19 | 85 Gigabytes |
| 0.0001875 Gigabytes per second (1500kbps) | 7200 seconds | 6 | 8.1 Gigabytes |

Total: 93.1 Gigabytes
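If you want to check the arithmetic yourself, here’s a quick Python sketch (mine, not from the article) that does the kbps-to-gigabytes conversion, assuming decimal units (1 kbps = 1000 bits/s, 1 GB = 10⁹ bytes). The high-quality row works out to roughly 85.5 GB, which the table rounds down to 85, hence its slightly lower total.

```python
# Back-of-the-envelope check of the bandwidth table above.
# Assumes decimal units: 1 kbps = 1000 bits/s, 1 GB = 10^9 bytes.

def stream_gb(bitrate_kbps: float, seconds: float, viewers: int) -> float:
    """Gigabytes sent to `viewers` watching one rendition for `seconds`."""
    bytes_per_second = bitrate_kbps * 1000 / 8         # kbps -> bytes per second
    return bytes_per_second * seconds * viewers / 1e9  # bytes -> gigabytes

high = stream_gb(5000, 7200, 19)  # ~85.5 GB (the table rounds this to 85)
low = stream_gb(1500, 7200, 6)    # ~8.1 GB
print(f"high: {high:.1f} GB, low: {low:.1f} GB, total: {high + low:.1f} GB")
```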
#### How much CPU?
| Quality | CPU Usage |
|---|---|
| 5000kbps | Some (it matches the input) |
| 1500kbps | More (CPU needs to be used to compress the video) |
#### How is the viewer experience?
| Quality | Viewers | Experience |
|---|---|---|
| 5000kbps | 20 | Good |
| 1500kbps | 5 | Good |
Result: You’ve provided both a high and a low quality option, so viewers on a slow network have something that works for them, and those with a fast network that periodically slows down can dip into the low quality when needed. In this case you also saved almost 20GB of bandwidth by offering the lower quality. You’re using more CPU for a much better experience, and you’d be able to stream about 43 times in a month before hitting your bandwidth limit.
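For those last two numbers, here’s a similarly rough sketch (again mine, assuming the 4TB cap means 4000 GB):

```python
# Where "almost 20GB saved" and "43 streams per month" come from.
# Assumes 4 TB of monthly transfer means 4000 GB.

per_stream_gb = 93.1                # total from the table above
all_high_gb = 0.000625 * 7200 * 25  # every viewer on 5000kbps: 112.5 GB

print(f"saved per stream:  {all_high_gb - per_stream_gb:.1f} GB")  # ~19.4 GB
print(f"streams per month: {4000 / per_stream_gb:.0f}")            # ~43 two-hour streams
```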