360-degree video stitching for dual-fisheye lens cameras based on rigid moving least squares

Note: We do not have the ability to review this paper.

PubDate: February 2018

Teams: University of Texas-Arlington

Writers: Tuan Ho; Ioannis D. Schizas; K. R. Rao; Madhukar Budagavi

PDF: 360-degree video stitching for dual-fisheye lens cameras based on rigid moving least squares

Abstract

Dual-fisheye lens cameras are becoming popular for 360-degree video capture, especially for user-generated content (UGC), since they are affordable and portable. Images generated by dual-fisheye cameras have limited overlap and hence require non-conventional stitching techniques to produce high-quality 360×180-degree panoramas. This paper introduces a novel method to align these images using interpolation grids based on rigid moving least squares. Furthermore, jitter is a critical issue that arises when image-based stitching algorithms are applied to video. It stems from the unconstrained movement of the stitching boundary from one frame to another. Therefore, we also propose a new algorithm that maintains the temporal coherence of the stitching boundary to provide jitter-free 360-degree videos. Results show that the proposed method can produce higher-quality stitched images and videos than prior work.
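For readers unfamiliar with the alignment step, the sketch below illustrates rigid moving least squares (Schaefer et al., 2006) evaluated on a coarse grid, which is one way to build the kind of interpolation grid the abstract refers to. It is not the authors' implementation; the function names, parameters (e.g. alpha, the grid step), and the toy control points are illustrative assumptions.

```python
# Minimal sketch: rigid moving least squares (MLS) warp evaluated on a
# coarse grid. Control-point matches p -> q (e.g. from the narrow fisheye
# overlap) drive a per-vertex rigid transform; the resulting grid can be
# upsampled to a dense warp field for stitching. Illustrative only.
import numpy as np

def rigid_mls_point(v, p, q, alpha=1.0, eps=1e-8):
    """Rigid-MLS deformation of a 2-D point v, given control points p -> q.

    For each v this solves the weighted rigid (rotation + translation)
    least-squares fit of p onto q with weights 1 / |p_i - v|^(2*alpha),
    the closed form given by Schaefer et al. (2006).
    """
    w = 1.0 / (np.sum((p - v) ** 2, axis=1) ** alpha + eps)  # MLS weights
    p_star = w @ p / w.sum()                                 # weighted centroids
    q_star = w @ q / w.sum()
    p_hat, q_hat = p - p_star, q - q_star
    # Weighted 2-D Procrustes: rotation angle from dot and cross terms.
    cos_t = np.sum(w * np.sum(p_hat * q_hat, axis=1))
    sin_t = np.sum(w * (p_hat[:, 0] * q_hat[:, 1] - p_hat[:, 1] * q_hat[:, 0]))
    theta = np.arctan2(sin_t, cos_t)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ (v - p_star) + q_star

def interpolation_grid(shape, p, q, step=16):
    """Evaluate the rigid-MLS warp on a coarse grid of vertices."""
    h, w = shape
    ys, xs = np.arange(0, h, step), np.arange(0, w, step)
    grid = np.empty((len(ys), len(xs), 2))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            grid[i, j] = rigid_mls_point(np.array([x, y], dtype=float), p, q)
    return grid  # upsample to a per-pixel map before remapping the image

# Toy matches inside a narrow vertical overlap band (purely illustrative).
p = np.array([[10.0, 5.0], [12.0, 60.0], [11.0, 120.0], [9.0, 180.0]])
q = p + np.array([[1.5, 0.2], [1.2, -0.3], [1.8, 0.1], [1.4, 0.4]])
print(interpolation_grid((200, 64), p, q).shape)  # -> (13, 4, 2)
```

In practice the correspondences would come from feature matches in the limited overlap region between the two unwarped fisheye images, and the coarse grid would be interpolated (e.g. bilinearly) to a dense warp before compositing; the paper's temporal-coherence step for video is not sketched here.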
