A Multimodal Tracking Approach For Augmented Reality Applications
PubDate: May 2021
Teams: University of Coimbra; Ingeniarius
Writers: Beril Yalcinkaya; Joao Aguizo; Micael Couceiro; Antonio Figueiredo
Augmented reality (AR) combines real and virtual elements. The main challenge in this field of research is to achieve a sustained, accurate registration between real and computer-generated objects, which depends on accurately estimating the user's viewpoint, i.e., the position and rotation, or pose, of the user's camera. Computer vision approaches, namely marker-based tracking, are among the most commonly used techniques in AR for estimating the camera pose. However, they suffer from optical noise, occlusion, and fast user motion. In this paper, an Extended Kalman Filter (EKF) is proposed for sensor fusion, combining marker-based tracking with an inertial sensor and an external depth camera for accurate registration. The results show that the EKF can provide a combined pose estimate of the user's viewpoint whether or not marker-based tracking succeeds, mitigating the aforementioned limitations.
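The fusion idea described in the abstract can be illustrated with a minimal Kalman-filter sketch: inertial data drives the prediction step, while marker-based position fixes drive the correction step and are simply skipped when the marker is occluded. This is a simplified 1-D linear illustration under assumed noise values, not the paper's full 6-DoF EKF; all names and constants here are illustrative assumptions.

```python
import numpy as np

# 1-D constant-velocity model: state x = [position, velocity].
# The IMU supplies acceleration for prediction; the marker tracker
# supplies position measurements for correction when visible.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # motion model
B = np.array([[0.5 * dt**2], [dt]])     # acceleration input matrix
H = np.array([[1.0, 0.0]])              # marker measures position only
Q = 1e-3 * np.eye(2)                    # process noise (assumed)
R = np.array([[1e-2]])                  # marker noise (assumed)

x = np.zeros((2, 1))                    # initial state estimate
P = np.eye(2)                           # initial covariance

def predict(x, P, accel):
    """IMU-driven prediction step."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def correct(x, P, z):
    """Marker-based correction step (skipped when the marker is lost)."""
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate constant 1 m/s^2 motion; the marker is "occluded" on odd
# frames, so the filter coasts on the IMU prediction during those gaps.
true_pos, true_vel = 0.0, 0.0
for k in range(50):
    true_vel += 1.0 * dt
    true_pos += true_vel * dt
    x, P = predict(x, P, 1.0)
    if k % 2 == 0:
        x, P = correct(x, P, np.array([[true_pos]]))

print(f"estimate={float(x[0, 0]):.2f} m, truth={true_pos:.2f} m")
```

Even with half the marker observations missing, the estimate stays close to the true trajectory, which is the behavior the paper reports for its multimodal EKF when marker tracking intermittently fails.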