A Large-Scale Indoor Layout Reconstruction and Localization System for Spatial-Aware Mobile AR Applications
PubDate: December 2021
Teams: National Tsing Hua University; National Taiwan University of Science and Technology
Writers: Kai-Wen Hsiao; Jheng-Wei Su; Yu-Chih Hung; Kuo-Wei Chen; Chih-Yuan Yao; Hung-Kuo Chu
Abstract
Augmented reality (AR) has renewed how people see and process information in daily life by blending the real world with rich, interactive virtual content. However, restricted by the limited computational resources of mobile devices, most existing AR applications can process only local features in front of the camera. This limitation severely hinders the development of more practical AR applications that demand more complex interaction. To break this barrier, we introduce in this work a novel spatial-aware AR application framework for indoor environments. The core techniques include a large-scale indoor layout reconstruction system that reconstructs the 3D layout of an entire floor plan from a few panoramas taken with commodity cameras, and an online localization system that accurately aligns the virtual 3D layout with the real-world environment in real time. By incorporating the perception of global scene structure into the AR interaction pipeline, our system supports more sophisticated applications with complex interplay among users, virtual objects, and the real-world scene. We test our framework in a large-scale indoor environment and demonstrate its effectiveness on two practical AR applications: a first-person shooting game and indoor navigation.
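The online localization described in the abstract ultimately has to express the reconstructed layout in the live AR session's coordinate frame. The sketch below is not the authors' method; it is a minimal illustration, assuming a handful of matched anchor points, of the standard rigid (Kabsch) alignment one could use for such layout-to-session registration. The function name and the anchor values are hypothetical.

```python
# Minimal sketch: rigidly align a reconstructed layout frame to the AR session
# frame from corresponding anchor points (Kabsch fit). Illustrative only; not
# the paper's actual localization pipeline.
import numpy as np

def fit_rigid_transform(layout_pts, ar_pts):
    """Return R (3x3) and t (3,) mapping layout coordinates into AR coordinates."""
    assert layout_pts.shape == ar_pts.shape
    mu_l = layout_pts.mean(axis=0)
    mu_a = ar_pts.mean(axis=0)
    H = (layout_pts - mu_l).T @ (ar_pts - mu_a)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_a - R @ mu_l
    return R, t

# Hypothetical example: three matched anchors (layout frame -> AR frame), in meters.
layout_anchors = np.array([[0.0, 0.0, 0.0], [4.2, 0.0, 0.0], [4.2, 0.0, 3.1]])
ar_anchors     = np.array([[1.0, 0.1, 2.0], [5.2, 0.1, 2.0], [5.2, 0.1, 5.1]])
R, t = fit_rigid_transform(layout_anchors, ar_anchors)
aligned = layout_anchors @ R.T + t                        # layout geometry in the AR frame
```

In practice a real-time system would refine such an alignment continuously against the device's tracked pose rather than fit it once, but the same rigid-transform idea underlies keeping the virtual layout registered to the physical scene.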