VASTile: Viewport Adaptive Scalable 360-Degree Video Frame Tiling
PubDate: October 2021
Team: The University of Sydney
Authors: Chamara Madarasingha; Kanchana Thilakarathna
360° videos, a.k.a. spherical videos, are growing in popularity; nevertheless, the omnidirectional view of these videos demands high bandwidth and processing power at end devices. Recently proposed viewport-aware streaming mechanisms can reduce the amount of data transmitted by streaming only the portion of the frame covering the current user viewport (VP). However, they still send a high amount of redundant data, because fixed-tile mechanisms cannot match the user VP at a finer granularity. Although making the tiles smaller can provide finer granularity for the user viewport, it significantly increases encoding-decoding overhead. To overcome this trade-off, this paper presents VASTile, an adaptive tiling mechanism based on computational geometry that takes visual attention information on a 360° video frame as input and produces a non-overlapping, variable-size tile cover of the frame. Experimental results show that, compared to recently proposed fixed tile configurations, VASTile reduces pixel redundancy before compression by up to 31.1% and saves up to 35.4% of bandwidth, generating tile schemes within 0.98 (±0.11) seconds.
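To make the idea of a non-overlapping, variable-size tile cover concrete, the sketch below uses a simple quadtree-style split driven by a binary viewport mask: regions that mix viewport and non-viewport cells are subdivided, so tiles are small near the viewport boundary and large elsewhere. This is an illustrative stand-in only; VASTile's actual method is based on computational geometry and is not reproduced here, and the function name `tile_frame` and the mask representation are assumptions for this example.

```python
# Illustrative quadtree-style variable-size tiling (NOT the VASTile algorithm).
# `mask` is a 2D list of 0/1 cells marking the predicted viewport region.

def tile_frame(mask, x, y, w, h, min_tile=2):
    """Return non-overlapping (x, y, w, h) tiles covering the given region.

    A region is split into quadrants while it mixes viewport (1) and
    non-viewport (0) cells and is still larger than `min_tile`.
    """
    cells = [mask[j][i] for j in range(y, y + h) for i in range(x, x + w)]
    uniform = all(c == cells[0] for c in cells)
    if uniform or (w <= min_tile and h <= min_tile):
        return [(x, y, w, h)]
    tiles = []
    hw, hh = max(w // 2, 1), max(h // 2, 1)
    for qx, qy, qw, qh in [
        (x, y, hw, hh),
        (x + hw, y, w - hw, hh),
        (x, y + hh, hw, h - hh),
        (x + hw, y + hh, w - hw, h - hh),
    ]:
        if qw > 0 and qh > 0:
            tiles += tile_frame(mask, qx, qy, qw, qh, min_tile)
    return tiles

# Toy 8x8 frame: the viewport covers the top-left 4x4 block, so one large
# tile covers it and the remaining quadrants stay coarse.
mask = [[1 if i < 4 and j < 4 else 0 for i in range(8)] for j in range(8)]
tiles = tile_frame(mask, 0, 0, 8, 8)
```

In this toy case the frame resolves to four 4x4 tiles; a mask with an irregular viewport boundary would instead yield progressively smaller tiles along that boundary, which is the granularity/overhead trade-off the abstract describes.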