A Robust Method for Hands Gesture Recognition from Egocentric Depth Sensor
PubDate: May 2019
Teams: Beihang University; Beihang University Qingdao Research Institute
Writers: Ye Bai; Yue Qi
PDF: A Robust Method for Hands Gesture Recognition from Egocentric Depth Sensor
Abstract
We present a method for robust and accurate hand pose recognition from egocentric depth cameras. Our method combines CNN-based hand pose estimation with hand gesture recognition based on joint locations. In the pose estimation stage, we use a hand geometry prior network to estimate the hand pose. In the gesture recognition stage, we define a hand language based on a set of pre-defined basic propositions, obtained by applying four predicate types to the finger and palm states. The hand language is used to convert the estimated joint locations into hand gestures. Our experimental results indicate that the method enables robust and accurate gesture recognition under self-occlusion.
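To make the second stage concrete, the sketch below shows one way joint locations estimated by a pose network could be turned into gesture labels through finger-state predicates. The predicate (an angle-based "extended" test), the gesture lexicon, and all thresholds are illustrative assumptions, not the paper's actual hand language or its four predicate types.

```python
import numpy as np

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def finger_extended(joints, finger, thresh_deg=35.0):
    """Hypothetical predicate: a finger counts as extended if its joint
    chain is nearly straight.

    `joints[finger]` is assumed to be a (4, 3) array of 3D joint positions
    ordered from the base (MCP) to the fingertip.
    """
    chain = np.asarray(joints[finger])
    v1 = chain[1] - chain[0]          # proximal segment direction
    v2 = chain[3] - chain[2]          # distal segment direction
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
    return angle < thresh_deg

# Illustrative gesture lexicon: a gesture is a tuple of per-finger states
# (True = extended, False = curled), ordered thumb..pinky.
GESTURES = {
    (False, True,  False, False, False): "point",
    (True,  True,  True,  True,  True ): "open_palm",
    (False, False, False, False, False): "fist",
    (True,  True,  False, False, False): "pinch_ready",
}

def recognize(joints):
    """Map estimated joint locations to a gesture label, or None if the
    finger-state tuple is not in the lexicon."""
    state = tuple(finger_extended(joints, f) for f in FINGERS)
    return GESTURES.get(state)
```

Because the gesture decision operates only on relative joint geometry, such a rule-based stage is largely insensitive to where the joints came from, which is consistent with the paper's claim that combining a learned pose estimator with a symbolic hand language helps under self-occlusion.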