Attentive Deep Stitching and Quality Assessment for 360° Omnidirectional Images
PubDate: November 2019
Teams: Beihang University; Chinese Academy of Sciences
Writers: Jia Li; Yifan Zhao; Weihua Ye; Kaiwen Yu; Shiming Ge
360° omnidirectional images are key to creating immersive multimedia content, which creates a strong demand for both their efficient generation and effective quality assessment. In this paper, we leverage an attentive idea to meet this demand by addressing two concerns: how to generate a good omnidirectional image in a fast and robust way, and what makes an omnidirectional image good for human viewers. To this end, we propose an attentive deep stitching approach to facilitate the efficient generation of omnidirectional images, which is composed of two modules. The low-resolution deformation module learns the deformation rules from dual-fisheye to omnidirectional images with joint implicit and explicit attention mechanisms, while the high-resolution recurrence module enhances the resolution of the stitching results with high-resolution guidance in a recurrent manner. In this way, the stitching approach can efficiently generate high-resolution omnidirectional images that are highly consistent with human immersive experiences. Beyond efficient generation, we further present an attention-driven omnidirectional image quality assessment (IQA) method that jointly evaluates both global and local metrics. In particular, the local metric focuses on the stitching region and the attention region, which most strongly affect the Mean Opinion Score (MOS), leading to an evaluation consistent with human perception. To verify the effectiveness of our proposed stitching and assessment approaches, we construct a hybrid benchmark evaluation with 7 stitching models and 8 IQA metrics. Qualitative and quantitative experiments show that our stitching approach generates results preferable to those of state-of-the-art models at a 6× faster speed, and that the proposed quality assessment approach surpasses other methods by a large margin while remaining highly consistent with human subjective evaluations.
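The joint global/local evaluation described above can be illustrated with a minimal sketch. The function names, the choice of mean-squared error as the per-region metric, and the blending weight `alpha` are all illustrative assumptions, not details from the paper; the key idea shown is restricting a local error term to the stitching and attention regions before combining it with a global term.

```python
import numpy as np

def local_region_score(img_a, img_b, mask):
    # Mean squared difference restricted to a binary region mask.
    diff = (img_a - img_b) ** 2
    return float(diff[mask].mean())

def joint_iqa_score(stitched, reference, stitch_mask, attention_mask, alpha=0.5):
    # Hypothetical joint metric: blend a global error over the whole image
    # with a local error concentrated on the stitching and attention regions.
    # The 0.5/0.5 weighting and MSE choice are placeholders for illustration.
    global_err = float(((stitched - reference) ** 2).mean())
    local_mask = stitch_mask | attention_mask
    local_err = local_region_score(stitched, reference, local_mask)
    return alpha * global_err + (1 - alpha) * local_err

# Usage: identical images score 0; a uniform offset raises both terms.
rng = np.random.default_rng(0)
ref = rng.random((8, 8))
stitch_mask = np.zeros((8, 8), dtype=bool)
stitch_mask[:, 3:5] = True          # a vertical seam band
attention_mask = np.zeros((8, 8), dtype=bool)
attention_mask[2:4, 2:4] = True     # a salient patch
print(joint_iqa_score(ref, ref, stitch_mask, attention_mask))  # 0.0
```

In practice, a learned attention model would produce `attention_mask`, and a full-reference or no-reference metric would replace the simple MSE; the sketch only captures the region-weighted aggregation.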