TCP Congestion Response for Low Latency HTTP Live Streaming

Published: 01 Jan 2018, Last Modified: 13 May 2025. WOWMOM 2018. CC BY-SA 4.0.
Abstract: Adaptive bit-rate streaming over HTTP is now widely used to deliver live audio-visual content services. However, its quality of experience is often considered inferior to that of conventional broadcast services because of high end-to-end latency, most of which stems from the large amount of buffering needed to provide resilience to variable network conditions. In this paper we present our initial research to address this issue of high latency. If each segment of content could be delivered in a consistent period of time, the amount of buffering required could be reduced. We present the results of simulations in which we change the timing of the TCP congestion response to take account of the timing requirements of content segments, while aiming to retain fairness to competing flows. We show that with this simple modification to TCP, the variation in the delivery time of content segments can be reduced and much lower end-to-end latency achieved. As no changes to the network or client devices are required, this solution may be straightforward to deploy, bringing the user experience of streaming services much closer to that of conventional broadcast services.
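To illustrate the idea of shaping the congestion response around segment deadlines, here is a minimal, hypothetical sketch. It is not the paper's actual algorithm: the function name, parameters, and the specific rule (temper the standard halving of the congestion window only when the segment's delivery deadline is at risk) are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's algorithm): a deadline-aware variant
# of TCP's multiplicative decrease. On a loss event, standard TCP halves the
# congestion window; here the sender also considers how many bytes of the
# current content segment remain and how much time is left before the
# segment's delivery deadline, and applies the full back-off only when the
# deadline is not at risk.

def congestion_response(cwnd, bytes_remaining, time_remaining, rtt, mss=1460):
    """Return the new congestion window (bytes) after a loss event.

    cwnd            -- current congestion window in bytes
    bytes_remaining -- bytes of the current segment still to deliver
    time_remaining  -- seconds left before the segment's deadline
    rtt             -- smoothed round-trip time in seconds
    mss             -- maximum segment size in bytes
    """
    standard = max(cwnd // 2, 2 * mss)  # conventional TCP response
    if time_remaining <= 0:
        return standard  # deadline already missed: behave conventionally
    # Rough window needed to finish the segment by its deadline,
    # assuming roughly one cwnd of data delivered per RTT.
    required = bytes_remaining * rtt / time_remaining
    # Back off as far as fairness allows without missing the deadline,
    # but never grow the window on a loss event.
    return int(max(standard, min(cwnd, required)))

# Example: halving would risk the deadline, so the reduction is tempered.
print(congestion_response(146_000, bytes_remaining=400_000,
                          time_remaining=0.5, rtt=0.1))
```

Under this sketch, a flow whose segment is comfortably ahead of its deadline backs off exactly as standard TCP would, preserving fairness to competing flows; only a flow at risk of a late segment reduces its window less aggressively.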