
360ViewPET: View Based Pose EsTimation for Ultra-Sparse 360-Degree Cameras

Note: We are not able to review papers.

PubDate: January 2022

Teams: University of Illinois Urbana-Champaign

Authors: Qian Zhou; Bo Chen; Zhe Yang; Hongpeng Guo; Klara Nahrstedt

PDF: 360ViewPET: View Based Pose EsTimation for Ultra-Sparse 360-Degree Cameras

Abstract

Immersive virtual tours based on 360-degree cameras, showing famous outdoor scenery, are becoming increasingly desirable due to travel costs, pandemics, and other constraints. To feel immersive, a user must receive the view that accurately corresponds to her position and orientation in the virtual space as she moves within it, which requires the cameras' orientations to be known. Outdoor tour contexts have numerous, ultra-sparse cameras deployed across a wide area, making camera pose estimation challenging. As a result, pose estimation techniques like SLAM, which require mobile or dense cameras, are not applicable. In this paper we present a novel strategy called 360ViewPET, which automatically estimates the relative poses of two stationary, ultra-sparse (15 meters apart) 360-degree cameras using one equirectangular image taken by each camera. Our experiments show that it achieves accurate pose estimation, with a mean error as low as 0.9 degrees.
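The underlying task, recovering relative pose from one equirectangular image per camera, rests on standard spherical epipolar geometry. The sketch below is a generic baseline for that geometry, not 360ViewPET's view-based method (which this page does not detail): map matched equirectangular pixels to unit bearing vectors, fit an essential matrix with the linear 8-point algorithm, and extract the candidate relative rotations. The function names and the NumPy-only pipeline are illustrative assumptions.

```python
import numpy as np

def equirect_to_bearing(u, v, width, height):
    """Map an equirectangular pixel (u, v) to a unit ray on the sphere."""
    lon = (u + 0.5) / width * 2.0 * np.pi - np.pi    # longitude in [-pi, pi)
    lat = np.pi / 2.0 - (v + 0.5) / height * np.pi   # latitude in [-pi/2, pi/2]
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def essential_8point(f1, f2):
    """Linear 8-point fit of E from matched bearings: f2^T E f1 = 0.

    f1, f2: lists of matched unit bearing vectors (at least 8 pairs).
    This is a textbook baseline, not the 360ViewPET algorithm.
    """
    # Each match contributes one row so that A @ E.ravel() = 0.
    A = np.stack([np.outer(b, a).ravel() for a, b in zip(f1, f2)])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    # Project onto the essential-matrix manifold (singular values 1, 1, 0).
    U, _, Vt = np.linalg.svd(E)
    return U @ np.diag([1.0, 1.0, 0.0]) @ Vt

def decompose_rotations(E):
    """Return the two candidate relative rotations from E = [t]_x R."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    return U @ W @ Vt, U @ W.T @ Vt
```

Picking between the two candidate rotations (and the two translation signs) requires a cheirality check on the matches, and with stationary cameras 15 meters apart the translation is recoverable from images alone only up to scale. Both caveats apply to any essential-matrix baseline for this setting.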
