Sign Language Recognition: Learning American Sign Language in a Virtual Environment
PubDate: May 201
Teams: University of South Florida
Writers: Jacob Schioppo; Zachary Meyer; Diego Fabiano; Shaun Canavan
PDF: Sign Language Recognition: Learning American Sign Language in a Virtual Environment
Abstract
In this paper, we propose an approach to sign language recognition that uses a virtual reality headset to create an immersive environment. We show how features extracted from Leap Motion controller data, captured from an egocentric view, can be used to automatically recognize a user's signed gesture. The Leap features are fed to a random forest for real-time classification of the user's gesture. We further analyze which of these features are most important, in an egocentric view, for gesture recognition. To test the efficacy of the proposed approach, we evaluate it on the 26 letters of the American Sign Language alphabet in a virtual environment, within an application for learning sign language.
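The classification pipeline described in the abstract (hand features in, a random forest producing letter predictions and per-feature importances) can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic data; the feature layout (21 hand joints × 3 coordinates) and all dimensions are assumptions for demonstration, not values taken from the paper.

```python
# Hypothetical sketch: classifying per-frame hand feature vectors
# (e.g., Leap Motion joint positions) into ASL letters with a random
# forest. All dimensions and data here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
N_LETTERS = 26           # the 26 letters of the ASL alphabet
FEATURES_PER_FRAME = 63  # assumed: 21 hand joints x (x, y, z)
FRAMES_PER_LETTER = 20   # assumed number of training frames per class

# Synthetic stand-in for recorded frames: one Gaussian cluster per letter.
X = np.vstack([
    rng.normal(loc=i, scale=0.5, size=(FRAMES_PER_LETTER, FEATURES_PER_FRAME))
    for i in range(N_LETTERS)
])
y = np.repeat(np.arange(N_LETTERS), FRAMES_PER_LETTER)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Feature importances indicate which hand features contribute most to
# the decision, analogous to the feature-importance analysis mentioned
# in the abstract.
importances = clf.feature_importances_
```

At prediction time, each incoming frame's feature vector would be passed to `clf.predict`, which is cheap enough for the real-time use described in the abstract.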