Detecting Touch and Grasp Gestures Using a Wrist-Worn Optical and Inertial Sensing Network
PubDate: July 2022
Teams: Stanford University
Writers: Savannah Cofer; Tyler N. Chen; Jackie Junrui Yang; Sean Follmer
PDF: Detecting Touch and Grasp Gestures Using a Wrist-Worn Optical and Inertial Sensing Network
Abstract
Freehand gesture-based interaction promises to enable rich interaction in applications such as augmented reality, virtual reality, human-robot interaction, and robotic prosthetic devices. However, current sensing approaches are limited to mid-air whole-hand gestures and fail to identify small-scale tactile interactions with unsensorized environments. Detecting tactile interactions could unlock new applications in augmented reality and human-robot interaction in which unsensorized surfaces serve as touch input devices. This work presents a novel wrist-worn sensing device that combines near-infrared and inertial measurement unit sensing to enable high-accuracy detection of surface touch and grasp interactions. Two convolutional neural networks map the device inputs to detect touch events and to classify them by gesture type or direction. We evaluated the accuracy and temporal precision of our system for event detection and classification. Results from an in-lab user study with 12 participants showed an average of 97% touch detection accuracy and 98% grasp detection accuracy. In our study, we found that near-infrared and inertial sensing are complementary and can be used in tandem to effectively address both touch event detection and directionality classification.
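The abstract describes fusing near-infrared and IMU channels and feeding them to convolutional networks for touch detection. As a rough illustration of that fusion idea only, the minimal NumPy sketch below stacks hypothetical NIR and IMU sample windows channel-wise and runs them through a tiny 1-D convolution followed by pooling and a sigmoid touch score. All channel counts, kernel sizes, and the `detect_touch` function are invented for illustration; the paper's actual network architecture is not specified here.

```python
import numpy as np

def conv1d(x, kernels, bias):
    """Valid-mode multi-channel 1-D convolution with ReLU.
    x: (channels, time); kernels: (filters, channels, width)."""
    f, c, w = kernels.shape
    t_out = x.shape[1] - w + 1
    out = np.zeros((f, t_out))
    for i in range(t_out):
        window = x[:, i:i + w]  # (channels, width)
        out[:, i] = np.tensordot(kernels, window,
                                 axes=([1, 2], [0, 1])) + bias
    return np.maximum(out, 0.0)

def detect_touch(nir, imu, kernels, bias, threshold=0.5):
    """Fuse NIR and IMU windows channel-wise, convolve, and
    emit a touch score in [0, 1] via global pooling + sigmoid."""
    x = np.vstack([nir, imu])  # channel-wise sensor fusion
    feats = conv1d(x, kernels, bias)
    score = 1.0 / (1.0 + np.exp(-feats.mean()))
    return score, bool(score > threshold)

# Illustrative channel counts (not from the paper):
rng = np.random.default_rng(0)
nir = rng.normal(size=(4, 50))   # hypothetical 4 NIR channels, 50 samples
imu = rng.normal(size=(6, 50))   # hypothetical 6-axis IMU window
kernels = rng.normal(scale=0.1, size=(8, 10, 5))  # 8 filters over 10 channels
score, touched = detect_touch(nir, imu, kernels, 0.0)
print(f"touch score: {score:.3f}, detected: {touched}")
```

A second network with the same structure but a multi-class output head could analogously classify a detected event by gesture type or direction.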