AdaP-360: User-Adaptive Area-of-Focus Projections for Bandwidth-Efficient 360-Degree Video Streaming
Abstract: 360-degree video is an emerging medium that presents an immersive view of the environment to the user. Despite this potential, 360-degree video has not achieved widespread popularity, and a significant cause of the slow adoption is the format's high bandwidth requirements. The primary source of bandwidth inefficiency in 360-degree video streaming, unaddressed in popular transmission methods, is the discrepancy between the pixels sent over the network (typically the full omnidirectional view) and the pixels displayed in the head-mounted display's field of view. At worst, roughly 88% of transmitted pixels remain unviewed. In this work, we motivate a user-adaptive approach to these inefficiencies through an analysis of user-viewing traces. We design a greedy algorithm that generates projections of the spherical surface which allow user-view trajectories to be transmitted efficiently, and we demonstrate that our approach applies to many popular 360-degree projection layouts. In BD-rate experiments, the adaptive versions of the rotated spherical projection (RSP) and the equi-angular cubemap (EAC) reduce bitrate by 26.2% and 24.0% on average, respectively, while achieving the same visual quality of rendered views as their non-adaptive counterparts in a realistic scenario. These adaptive projections also achieve 53.1% bandwidth savings over the equirectangular projection.
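As a rough illustration of where a figure near 88% can come from (a reader's back-of-envelope sketch, not the paper's calculation): on an equirectangular frame, which maps 360 degrees of longitude and 180 degrees of latitude onto its width and height, a head-mounted display with an assumed 90-by-90-degree field of view covers only about (90/360) x (90/180) = 12.5% of the transmitted pixels, leaving roughly 87.5% unviewed. The field-of-view values below are assumptions for illustration only.

```python
# Back-of-envelope estimate of the viewport-to-frame pixel ratio for an
# equirectangular 360-degree frame. The 90 x 90 degree field of view is an
# assumed, illustrative value, not a figure taken from the paper, and
# projection distortion near the poles is ignored.

FOV_H_DEG = 90.0   # assumed horizontal field of view of the HMD
FOV_V_DEG = 90.0   # assumed vertical field of view of the HMD

# The viewed fraction is approximately the product of the angular coverage
# ratios in longitude (360 degrees) and latitude (180 degrees).
viewed_fraction = (FOV_H_DEG / 360.0) * (FOV_V_DEG / 180.0)
unviewed_fraction = 1.0 - viewed_fraction

print(f"viewed:   {viewed_fraction:.1%}")    # ~12.5%
print(f"unviewed: {unviewed_fraction:.1%}")  # ~87.5%, close to the ~88% cited above
```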