Sign Language Recognition in Virtual Reality
PubDate: January 2021
Teams: University of South Florida
Writers: Jacob Schioppo; Zachary Meyer; Diego Fabiano; Shaun Canavan
This paper presents a real-time system for sign language recognition in virtual reality (VR). The system uses an egocentric view with the HTC Vive VR headset along with a Leap Motion controller. In this demo, a random forest classifies the 26 letters of the American Sign Language alphabet from hand-crafted features extracted from the Leap Motion controller. We detail offline classification results showing the expressive power of the features used for recognition.
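The pipeline described above, a random forest over hand-crafted Leap Motion hand features classifying 26 letters, can be sketched roughly as follows. This is an illustrative sketch, not the authors' code: the feature layout (fingertip coordinates relative to the palm) and the synthetic data are assumptions for demonstration, and scikit-learn's random forest stands in for whatever implementation the paper used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
N_LETTERS = 26   # the 26 letters of the ASL alphabet
N_FEATURES = 15  # assumed layout: 5 fingertips x (x, y, z) relative to the palm

# Synthetic stand-in data: one Gaussian cluster of feature vectors per letter,
# mimicking repeated signs of each letter captured by the Leap Motion controller.
centers = rng.normal(size=(N_LETTERS, N_FEATURES))
X = np.vstack([c + 0.05 * rng.normal(size=(40, N_FEATURES)) for c in centers])
y = np.repeat(np.arange(N_LETTERS), 40)

# Offline evaluation: hold out a test split and score the random forest on it.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"offline test accuracy: {acc:.2f}")
```

In the real system each feature vector would come from a live Leap Motion frame rather than synthetic clusters, and the trained forest's prediction would drive the VR display in real time.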