A Multimodal Tracking Approach For Augmented Reality Applications

PubDate: May 2021

Teams: University of Coimbra; Ingeniarius

Writers: Beril Yalcinkaya; Joao Aguizo; Micael Couceiro; Antonio Figueiredo

PDF: A Multimodal Tracking Approach For Augmented Reality Applications


Augmented reality (AR) combines real and virtual elements. The main challenge in this field of research is to achieve a sustained, accurate registration between real and computer-generated objects, which depends on obtaining an accurate estimate of the user's viewpoint, i.e., the position and rotation (pose) of the user's camera. Computer vision approaches, namely marker-based tracking, are among the most commonly used techniques in AR to estimate the camera pose. However, they suffer from optical noise, occlusion, and fast user motion. In this paper, an Extended Kalman Filter (EKF) is proposed for sensor fusion, combining marker-based tracking with an inertial sensor and an external depth camera for accurate registration. The results show that the EKF can provide a combined pose estimate of the user's viewpoint whether or not marker-based tracking succeeds, mitigating the aforementioned limitations.
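The fusion idea described in the abstract can be illustrated with a minimal Kalman-filter sketch. This is not the paper's implementation: it assumes a simple constant-velocity motion model over a position-only state, and treats both the marker tracker and the depth camera as noisy 3-D position sensors with different variances. The key property it demonstrates is that when one sensor (e.g., the marker) drops out due to occlusion, the filter simply skips that update and keeps propagating the motion model.

```python
import numpy as np

class PoseEKF:
    """Hypothetical fusion filter: state x = [px, py, pz, vx, vy, vz]."""

    def __init__(self, dt=0.05):
        self.x = np.zeros(6)                   # state estimate
        self.P = np.eye(6)                     # state covariance
        self.F = np.eye(6)                     # constant-velocity transition
        self.F[:3, 3:] = dt * np.eye(3)
        self.Q = 0.01 * np.eye(6)              # process noise (assumed value)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # position measurement

    def predict(self):
        # Propagate the motion model; runs every frame even if no sensor fires.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, r):
        # z: 3-D position from one sensor, r: that sensor's variance.
        R = r * np.eye(3)
        y = z - self.H @ self.x                        # innovation
        S = self.H @ self.P @ self.H.T + R             # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

ekf = PoseEKF()
ekf.predict()
ekf.update(np.array([1.0, 0.0, 0.0]), r=0.1)    # precise marker measurement
ekf.predict()                                    # marker occluded this frame:
ekf.update(np.array([1.02, 0.0, 0.0]), r=0.5)   # fall back on depth camera
```

Because the measurement model here is linear, this reduces to a plain Kalman filter; the paper's EKF would instead linearize a nonlinear measurement model (e.g., camera projection and orientation) around the current estimate, and the IMU would typically drive the prediction step rather than a fixed constant-velocity model.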
