Real-Time 3D Hand Gesture Based Mobile Interaction Interface
PubDate: January 2020
Teams: State Key Laboratory of Virtual Reality Technology and Systems; Shenzhen Qingdao Research Institute of Beihang University
Writers: Yunlong Che; Yue Qi; Yuxiang Song
PDF: Real-Time 3D Hand Gesture Based Mobile Interaction Interface
Abstract
Hand gesture recognition is a challenging problem for natural human-computer interaction (HCI). We address this problem by introducing a real-time human-mobile interaction interface driven by a depth sensor. Our interface consists of two components: 3D hand pose estimation and hand-skeleton-state-based gesture description. First, we propose a 3D hand pose estimation method that combines learning-based pose initialization with physics-based model fitting, which estimates the per-frame hand pose of any hand appearing in the depth camera’s field of view. We then map the estimated pose to a gesture, e.g. open or close, through a hand-skeleton-state-based method. With the tracked hand gesture, we can stably and smoothly implement common operations such as ‘Touch’, ‘Grasp’ and ‘Hold’ through a mid-air interface. Our main contribution is combining 3D hand pose estimation with hand gesture tracking, and implementing a complete interaction application system.
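The abstract does not give implementation details, but the skeleton-state-based mapping it describes (estimated pose → coarse gesture such as open/close) can be sketched roughly as follows. All names, the three-joint flexion representation, and the threshold value here are assumptions for illustration, not the authors’ actual method:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Finger:
    """Hypothetical per-finger skeleton state: flexion angles (degrees)."""
    mcp: float  # metacarpophalangeal joint
    pip: float  # proximal interphalangeal joint
    dip: float  # distal interphalangeal joint

    def is_curled(self, threshold: float = 60.0) -> bool:
        # A finger counts as curled when its total flexion exceeds a
        # threshold; 60 degrees is an illustrative value, not from the paper.
        return (self.mcp + self.pip + self.dip) > threshold

def classify_gesture(fingers: List[Finger]) -> str:
    """Map a per-frame skeleton state to a coarse gesture label."""
    curled = sum(f.is_curled() for f in fingers)
    if curled == len(fingers):
        return "close"   # could trigger an operation like 'Grasp'
    if curled == 0:
        return "open"    # could end a 'Hold'
    return "partial"     # intermediate states left to the application

# Usage: an open hand vs. a fist, five fingers each.
open_hand = [Finger(5.0, 5.0, 5.0) for _ in range(5)]
fist = [Finger(60.0, 70.0, 40.0) for _ in range(5)]
print(classify_gesture(open_hand))  # open
print(classify_gesture(fist))       # close
```

Classifying on per-frame skeleton state rather than raw depth keeps the gesture decision cheap, which matters for the real-time mobile setting the paper targets; a real system would also smooth labels over time to avoid flicker between frames.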