Since 2017, Twitch has been delivering interactive experiences to millions of viewers through low-latency streaming. Low-latency streaming is an evolution of web streaming technology that brings the stream delay from creator to viewer down from over 30 seconds to just a few seconds.
The impact of this has been enormous: viewers are more engaged with their favorite streamers when they can chat in near real-time, and new streaming experiences have been developed around this newfound interactivity.
However, our data shows that for some viewers, the quality of service during a low-latency stream is not as good as a regular-latency stream. They buffer more frequently, and may see lower video quality. The root cause of this issue lies in the adaptive bitrate (ABR) algorithm, which hasn’t yet evolved to the low-latency streaming world.
ABR is one of the key components of a video player. It’s responsible for measuring the speed of your connection and choosing which quality should be playing. When Twitch viewers select the “Auto” quality, ABR is constantly working to minimize buffering while maximizing the quality you see.
Traditionally, ABR works by timing the download of each media segment and using this timing to derive an estimate of your bandwidth. For example, if your browser can download a 1 Mb segment in 1 second, your estimate is 1 Mb/s. The player then chooses the highest quality that can be reliably played with this estimate.
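The classic approach can be sketched in a few lines. This is a minimal, hypothetical illustration, not Twitch's player code; the function names, the 0.8 safety margin, and the bitrate ladder are all assumptions chosen for the example.

```python
# Illustrative sketch of classic segment-timing ABR (names and values are hypothetical).

def estimate_bandwidth_bps(segment_bytes: int, download_seconds: float) -> float:
    """Estimate throughput from one segment download, in bits per second."""
    return segment_bytes * 8 / download_seconds

def choose_quality(estimate_bps: float, ladder_bps: list[int], safety: float = 0.8) -> int:
    """Pick the highest rendition bitrate that fits within a safety margin
    of the bandwidth estimate; fall back to the lowest rendition otherwise."""
    affordable = [b for b in sorted(ladder_bps) if b <= estimate_bps * safety]
    return affordable[-1] if affordable else min(ladder_bps)

# A 1 Mb (125,000-byte) segment downloaded in 1 second yields a 1 Mb/s estimate.
estimate = estimate_bandwidth_bps(125_000, 1.0)
quality = choose_quality(estimate, [400_000, 750_000, 1_200_000, 3_000_000])
```

With a 1 Mb/s estimate and a 0.8 safety margin, the sketch selects the 750 kb/s rendition: the highest one that fits under 800 kb/s. Real players additionally smooth the estimate across many segments, but the core idea is the same.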
However, with low-latency streaming we can’t accurately time the length of a segment download. This is because we’re streaming the video to you in real-time, directly from the source. In other words, it takes 1 second for us to download 1 second of video, regardless of the quality or connection speed (assuming your connection is fast enough to download in real-time). This means that we can’t rely on our regular-latency ABR algorithm to provide a good experience.
While our low-latency ABR approach works well, it’s not as good as our regular-latency algorithm, especially given the new challenges of playing low-latency streams, such as having less time to respond to deteriorating network conditions. As a result, viewers watching low-latency streams often rebuffer more frequently, and spend more time watching a lower quality than their connection can reliably support.
This isn’t just a problem we’re facing at Twitch. All companies utilizing HTTP low-latency streaming, such as Periscope and Apple, face the same issue. The lack of a great ABR algorithm is one of the main factors preventing low-latency HTTP streaming from reaching widespread adoption, and from reducing latency even further than where it stands today.
To work towards a solution, we’ve partnered with ACM and are calling for developers and researchers to create new ABR algorithms tailored towards low-latency streaming. Cash prizes are on the line for the best two submissions! For more information and a detailed technical overview, please see https://2020.acmmmsys.org/lll_challenge.php.