An Evaluation of Viewport Estimation Methods in 360-Degree Video Streaming
PubDate: June 2022
Teams: Tohoku Institute of Technology
Writers: Duc Nguyen
Abstract
360-degree video is an integral part of Virtual Reality systems. However, transmitting 360-degree video over the network is challenging due to its large size. To reduce the network bandwidth requirement of 360-degree video, Viewport Adaptive Streaming (VAS) has been proposed. A key issue in VAS is how to estimate future user viewing directions. In this paper, we carry out an evaluation of typical viewport estimation methods for VAS. We find that the Long Short-Term Memory (LSTM)-based method achieves the best trade-off between accuracy and redundancy. Using cross-user behaviors achieves the highest accuracy at the expense of high redundancy. Meanwhile, the widely used linear regression-based method performs comparably to the simple method that reuses the last viewport position. In addition, we find that all considered methods suffer significant performance degradation when the prediction horizon increases beyond 1 second.
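To make the compared baselines concrete, here is a minimal sketch (not the paper's implementation) of the two simplest methods discussed: reusing the last viewport position, and extrapolating head yaw with a least-squares linear fit over recent samples. The function names and the single-angle (yaw-only) simplification are illustrative assumptions; a real system would predict full orientation and handle angle wraparound.

```python
def predict_last(yaws):
    """Baseline: assume the viewport stays at the last observed yaw (degrees)."""
    return yaws[-1]

def predict_linear(timestamps, yaws, horizon):
    """Linear-regression baseline: fit yaw = slope * t + intercept over recent
    samples, then extrapolate 'horizon' seconds past the last sample.
    Note: this sketch ignores the 360-degree wraparound of yaw angles."""
    n = len(timestamps)
    mean_t = sum(timestamps) / n
    mean_y = sum(yaws) / n
    # Ordinary least squares on (t, yaw) pairs.
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(timestamps, yaws))
    den = sum((t - mean_t) ** 2 for t in timestamps)
    slope = num / den
    intercept = mean_y - slope * mean_t
    return slope * (timestamps[-1] + horizon) + intercept
```

For a user turning their head at a steady 20 degrees/second, the linear method extrapolates that motion over the prediction horizon, while the last-position baseline simply freezes the viewport; the abstract's finding is that in practice these two perform comparably.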