Interactive Mobile Augmented Reality System for Image and Hand Motion Tracking

Note: We do not have the ability to review this paper.

PubDate: August 2018

Teams: National Chiao Tung University; National Taipei University of Technology

Writers: Pei-Hsuan Chiu; Po-Hsuan Tseng; Kai-Ten Feng

PDF: Interactive Mobile Augmented Reality System for Image and Hand Motion Tracking


In recent years, augmented reality (AR) has emerged as a promising technology that overlays virtual information such as videos, images, and three-dimensional objects onto a real camera view on mobile platforms. Interactive AR goes further by providing human-computer interaction, allowing the user to manipulate virtual objects on the mobile display. In this paper, we propose a cloud-based mobile augmented reality interactive system (MARIS), which comprises MARIS-I for image target tracking and MARIS-H for hand motion tracking. MARIS-I estimates the position of the image target with a feature-based mean-shift algorithm, whose small-region feature detection makes it feasible for real-time applications. MARIS-H provides two tracking modes, fingertip tracking and back-of-hand tracking, to enhance the user experience (UX) during interaction. The center position of either the back of the hand or the fingertip is first estimated by a particle filtering technique, which computes the weight of each particle according to a hand or fingertip model. In the fingertip tracking mode, the contour of the fingertip is then estimated by level-set-based contour evolution. Furthermore, we implement a device/cloud architecture for the proposed MARIS to reduce the memory requirements and computational complexity on the device side. Experimental results show that MARIS-I and MARIS-H outperform existing methods for image and hand motion tracking, respectively. The proposed MARIS is demonstrated with a picture book to provide rich interactive UX for digital learning systems.
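To make the mean-shift idea behind MARIS-I concrete, here is a minimal, hypothetical sketch of mean-shift tracking over a map of per-pixel feature-match scores: the search window is repeatedly moved to the weighted centroid of the scores inside it until the shift becomes negligible. The `weight_map`, window size, and Gaussian toy target are all illustrative assumptions, not the paper's actual feature model.

```python
import numpy as np

def mean_shift_track(weight_map, start, win=15, iters=20, tol=0.5):
    """Move a (2*win+1)-sized window to the weighted centroid of the
    scores it covers, iterating until convergence. Illustrative only."""
    y, x = float(start[0]), float(start[1])
    h, w = weight_map.shape
    for _ in range(iters):
        # clip the search window to the image bounds
        y0, y1 = max(0, int(y - win)), min(h, int(y + win) + 1)
        x0, x1 = max(0, int(x - win)), min(w, int(x + win) + 1)
        patch = weight_map[y0:y1, x0:x1]
        total = patch.sum()
        if total <= 0:
            break  # no evidence inside the window; stop
        ys, xs = np.mgrid[y0:y1, x0:x1]
        ny = (ys * patch).sum() / total  # weighted centroid (row)
        nx = (xs * patch).sum() / total  # weighted centroid (col)
        shift = np.hypot(ny - y, nx - x)
        y, x = ny, nx
        if shift < tol:
            break  # window has converged onto the mode
    return y, x

# toy example: a Gaussian "blob" of feature scores centred at (40, 60)
yy, xx = np.mgrid[0:100, 0:100]
scores = np.exp(-((yy - 40) ** 2 + (xx - 60) ** 2) / (2 * 5.0 ** 2))
est = mean_shift_track(scores, start=(30, 50))
```

Because only the small region inside the window is examined at each step, the per-frame cost stays low, which is the property the paper exploits for real-time operation.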
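Similarly, the particle filtering step used by MARIS-H can be sketched as a predict/weight/resample cycle. The random-walk motion model, the Gaussian likelihood around a single observed centre point, and all parameter values below are simplifying assumptions for illustration; the paper weights particles with a hand or fingertip model rather than a raw position observation.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, observation, motion_std=2.0, obs_std=5.0):
    """One predict/weight/resample cycle estimating a 2-D centre position.
    Hypothetical sketch, not the paper's exact weighting model."""
    # predict: random-walk motion model (illustrative assumption)
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # weight: Gaussian likelihood of each particle given the observation
    d2 = ((particles - observation) ** 2).sum(axis=1)
    weights = np.exp(-d2 / (2 * obs_std ** 2))
    weights /= weights.sum()
    # estimate: weighted mean of the particle cloud
    estimate = (particles * weights[:, None]).sum(axis=0)
    # resample: draw particles in proportion to their weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], estimate

# toy run: track a point moving right at 1 px/frame under noisy observations
particles = rng.uniform(0, 100, size=(500, 2))
true_pos = np.array([50.0, 50.0])
for _ in range(30):
    true_pos = true_pos + np.array([1.0, 0.0])
    obs = true_pos + rng.normal(0, 1.0, 2)
    particles, est = particle_filter_step(particles, obs)
```

In the fingertip mode, the estimated centre from such a filter would then seed the level-set contour evolution that recovers the fingertip outline.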