Event-driven stitching for tile-based live 360 video streaming
PubDate: June 2019
Teams: University of Illinois at Urbana-Champaign
Writers: Bo Chen; Zhisheng Yan; Haiming Jin; Klara Nahrstedt
Abstract
360 video streaming is gaining popularity because of the new type of experience it creates. Tile-based approaches have been widely used in VoD 360 video streaming to save network bandwidth. However, they cannot be extended to live streaming because they assume the 360 videos are stitched offline before streaming; in live 360 video streaming, stitching has to be done in real time. More importantly, the stitching speed, as shown in our experiments, is one order of magnitude lower than the network transmission speed, making stitching, rather than network transmission, the deciding factor of the overall frame rate. In this paper, we design a stitching algorithm for tile-based live 360 video streaming that adapts stitching quality to make the best use of the timing budget. There are two main challenges. First, existing tile-based approaches do not consider the various semantic information present in different scenarios. Second, the choice of tiling scheme for tile-based stitching is non-trivial. To address these two challenges, we present an event-driven stitching algorithm for tile-based live 360 video streaming, which consists of an event-driven model that abstracts various semantic information as events and a tile actuator that makes tiling scheme decisions. We implement a streaming system based on event-driven stitching called LiveTexture. To evaluate the proposed algorithm, we compare LiveTexture with other baseline systems and show that LiveTexture adapts well to various timing budgets, meeting 89.4% of the timing constraints. We also demonstrate that LiveTexture utilizes the timing budget more efficiently than the baselines.
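To make the idea of an event-driven model feeding a tile actuator concrete, the sketch below shows one plausible (hypothetical) reading of the abstract: semantic cues are abstracted as weighted events over frame regions, and the actuator coarsens the tiling of low-priority regions until the estimated stitching cost fits the timing budget. The `Event` class, `choose_tiling` function, and the linear per-tile cost model are all illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch of event-driven tiling: regions covered by more
# "events" (semantic cues such as motion) get finer tiles, and the
# tiling is coarsened until the estimated stitching time fits the
# timing budget. All names and the cost model are illustrative.

from dataclasses import dataclass

@dataclass
class Event:
    region: int     # index of the frame region the event falls in
    weight: float   # importance of the semantic cue

def choose_tiling(events, num_regions, budget_ms,
                  cost_per_tile_ms=2.0, max_tiles=8):
    """Return tiles-per-region, coarsening the least important regions
    first until the estimated stitching cost fits within budget_ms."""
    # Accumulate event importance per region, start with fine tiling.
    priority = [0.0] * num_regions
    for ev in events:
        priority[ev.region] += ev.weight
    tiles = [max_tiles] * num_regions

    def cost():
        return sum(tiles) * cost_per_tile_ms

    # Greedily halve the tile count of the least important region
    # that can still be coarsened.
    while cost() > budget_ms:
        candidates = [r for r in range(num_regions) if tiles[r] > 1]
        if not candidates:
            break  # cannot coarsen further; budget will be missed
        r = min(candidates, key=lambda r: priority[r])
        tiles[r] //= 2
    return tiles

# Example: 4 regions, motion events concentrated in region 2.
events = [Event(region=2, weight=1.0), Event(region=2, weight=0.5),
          Event(region=0, weight=0.2)]
tiling = choose_tiling(events, num_regions=4, budget_ms=30.0)
```

Under this cost model, the event-heavy region keeps the finest tiling while quiet regions are stitched coarsely, which is one way the timing budget could be traded against stitching quality.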