View-aware tile-based adaptations in 360 virtual reality video streaming
PubDate: April 2017
Teams: University of Illinois at Urbana-Champaign
Writers: Mohammad Hosseini
PDF: View-aware tile-based adaptations in 360 virtual reality video streaming
Abstract
We have proposed an adaptive, view-aware, bandwidth-efficient 360 VR video streaming framework based on the tiling features of MPEG-DASH SRD. We extend MPEG-DASH SRD to the 3D space of 360 VR videos and showcase a dynamic view-aware adaptation technique to tackle the high bandwidth demands of streaming 360 VR videos to wireless VR headsets. As part of our contributions, we spatially partition the underlying 3D mesh into multiple 3D sub-meshes and construct an efficient 3D geometry mesh, called a hexaface sphere, to optimally represent tiled 360 VR videos in 3D space. We then spatially divide the 360 videos into multiple tiles during encoding and packaging, use MPEG-DASH SRD to describe the spatial relationship of the tiles in 3D space, and prioritize the tiles within the Field of View (FoV) for view-aware adaptation. Our initial evaluations show that we can save up to 72% of the required bandwidth for 360 VR video streaming with only minor quality degradation, compared to a baseline scenario in which no adaptation is applied.
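The view-aware adaptation step described above can be pictured as a per-tile quality-selection loop driven by the viewer's head orientation. Below is a minimal Python sketch of that idea, assuming a plain equirectangular tile grid rather than the paper's hexaface sphere mesh; the grid dimensions, FoV angles, bitrate values, and function names such as `select_bitrates` are illustrative assumptions, not the authors' implementation or any MPEG-DASH library API.

```python
# Illustrative sketch of view-aware tile prioritization (not the paper's code).
# Assumes a simple cols x rows equirectangular tile grid instead of the
# hexaface sphere; FoV sizes and the two-level bitrate ladder are made up.

from dataclasses import dataclass


@dataclass
class Tile:
    yaw_min: float    # degrees, in [-180, 180)
    yaw_max: float
    pitch_min: float  # degrees, in [-90, 90]
    pitch_max: float


def make_tile_grid(cols: int = 6, rows: int = 3) -> list[Tile]:
    """Partition the 360x180 degree panorama into an even tile grid."""
    tiles = []
    yaw_step, pitch_step = 360.0 / cols, 180.0 / rows
    for r in range(rows):
        for c in range(cols):
            tiles.append(Tile(
                yaw_min=-180.0 + c * yaw_step,
                yaw_max=-180.0 + (c + 1) * yaw_step,
                pitch_min=-90.0 + r * pitch_step,
                pitch_max=-90.0 + (r + 1) * pitch_step,
            ))
    return tiles


def angular_overlap(lo1, hi1, lo2, hi2, wrap=None):
    """Return True if two angular intervals overlap (optionally wrapping)."""
    if wrap is None:
        return lo1 < hi2 and lo2 < hi1
    # Handle yaw wrap-around by also testing shifted copies of the interval.
    return any(lo1 + s < hi2 and lo2 < hi1 + s for s in (-wrap, 0.0, wrap))


def select_bitrates(tiles, view_yaw, view_pitch,
                    hfov=100.0, vfov=90.0,
                    hi_kbps=8000, lo_kbps=1000):
    """Assign a high bitrate to tiles intersecting the FoV, low otherwise."""
    bitrates = []
    for t in tiles:
        in_yaw = angular_overlap(t.yaw_min, t.yaw_max,
                                 view_yaw - hfov / 2, view_yaw + hfov / 2,
                                 wrap=360.0)
        in_pitch = angular_overlap(t.pitch_min, t.pitch_max,
                                   view_pitch - vfov / 2, view_pitch + vfov / 2)
        bitrates.append(hi_kbps if (in_yaw and in_pitch) else lo_kbps)
    return bitrates


if __name__ == "__main__":
    grid = make_tile_grid()
    # Viewer looking straight ahead: only tiles near yaw=0, pitch=0 get full quality.
    print(select_bitrates(grid, view_yaw=0.0, view_pitch=0.0))
```

In a DASH client, the resulting per-tile bitrate choices would map to selecting representations for the spatially described (SRD) adaptation sets of each tile, with out-of-view tiles fetched at the lowest quality to realize the bandwidth savings the abstract reports.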