Exploring CNN-Based Viewport Prediction for Live Virtual Reality Streaming

Note: We do not have the ability to review this paper.

PubDate: December 2019

Teams: Rutgers University

Writers: Xianglong Feng; Zeyang Bao; Sheng Wei

PDF: Exploring CNN-Based Viewport Prediction for Live Virtual Reality Streaming

Abstract

Live virtual reality streaming (a.k.a. 360-degree video streaming) has been gaining popularity with its rapid growth in the consumer market. However, the huge bandwidth required to deliver the 360-degree frames becomes the bottleneck, keeping this application from wider deployment. Research efforts have been carried out to solve the bandwidth problem by predicting the user's viewport of interest and selectively streaming a part of the whole frame. However, most current viewport prediction approaches cannot address the unique challenges of the live streaming scenario, where there are no historical user or video traces to build the prediction model. In this paper, we explore the opportunity of leveraging a convolutional neural network (CNN) to predict the user's viewport in live streaming by modifying the workflow of the CNN application and the training/testing process. The evaluation results reveal that the CNN-based method can achieve high prediction accuracy with low bandwidth usage and low timing overhead.
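To make the idea concrete, below is a minimal sketch of how a CNN might map a 360-degree frame to a predicted viewport. This is not the authors' model: the architecture, input resolution, and the choice to regress a normalized (yaw, pitch) viewport center are illustrative assumptions only.

```python
# Illustrative sketch only (assumed architecture, not the paper's model):
# a small CNN that takes an equirectangular RGB frame and regresses a
# viewport center as normalized (yaw, pitch) in [-1, 1].
import torch
import torch.nn as nn


class ViewportCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),  # downsample
            nn.ReLU(inplace=True),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> fixed-size feature vector
        )
        self.head = nn.Linear(64, 2)  # predict (yaw, pitch)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return torch.tanh(self.head(x))  # keep outputs in [-1, 1]


if __name__ == "__main__":
    model = ViewportCNN()
    frame = torch.randn(1, 3, 128, 256)  # one equirectangular frame (assumed size)
    print(model(frame))  # e.g. tensor([[ 0.03, -0.12]])
```

In a tiled streaming setup, such a prediction would typically be used to select which tiles of the upcoming frames to stream at high quality, which is where the bandwidth savings come from.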
