Sensor Fusion for Learning-based Tracking of Controller Movement in Virtual Reality
Teams: Microsoft
Writers: Chen Song, Shuayb Zarar
Publication date: September 2019
Abstract
Inside-out pose tracking of hand-held controllers is an important problem in virtual reality devices. The current state of the art combines a constellation of light-emitting diodes on the controllers with a stereo pair of cameras on the head-mounted display (HMD) to track pose. These vision-based systems are unable to track controllers once they move out of the cameras’ field of view (out-of-FOV). To overcome this limitation, we employ sensor fusion together with a learning-based model. Specifically, we place ultrasound sensors on the HMD and controllers to obtain ranging information. We combine this information with predictions from an auto-regressive forecasting model built with a recurrent neural network. The combination is achieved via a Kalman filter across the different positional states (including out-of-FOV). With the proposed approach, we demonstrate near-isotropic accuracy (∼1.23 cm error) in estimating controller position, which was not previously achievable with camera-only tracking.
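To illustrate the kind of fusion the abstract describes, below is a minimal sketch (not the authors' implementation) of an extended Kalman filter that treats the RNN's auto-regressive position forecast as a pseudo-measurement and incorporates ultrasound range readings from HMD-mounted sensors. All class and function names, noise values, sensor positions, and update rates are illustrative assumptions.

```python
# Hypothetical sketch: constant-velocity EKF fusing an RNN position forecast
# with ultrasound range measurements. Values below are assumed, not from the paper.
import numpy as np

class RangeAidedKalmanFilter:
    """EKF over 3-D controller position and velocity: state = [px, py, pz, vx, vy, vz]."""

    def __init__(self, dt=1.0 / 90.0):                  # assumed 90 Hz tracking rate
        self.x = np.zeros(6)                            # state estimate
        self.P = np.eye(6) * 0.1                        # state covariance
        self.F = np.eye(6)                              # constant-velocity transition
        self.F[:3, 3:] = np.eye(3) * dt
        self.Q = np.eye(6) * 1e-4                       # process noise (assumed)

    def predict(self):
        # Propagate the state and covariance one time step forward.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update_forecast(self, pos_pred, sigma=0.05):
        # Treat the RNN's forecast of controller position as a linear pseudo-measurement.
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        R = np.eye(3) * sigma**2
        self._linear_update(pos_pred, H, R)

    def update_range(self, hmd_sensor, rng, sigma=0.01):
        # Nonlinear update with one ultrasound range from a sensor at position hmd_sensor.
        p = self.x[:3]
        diff = p - hmd_sensor
        pred_rng = np.linalg.norm(diff)
        H = np.zeros((1, 6))
        H[0, :3] = diff / max(pred_rng, 1e-6)           # Jacobian of range w.r.t. position
        R = np.array([[sigma**2]])
        y = np.array([rng - pred_rng])                  # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(6) - K @ H) @ self.P

    def _linear_update(self, z, H, R):
        # Standard Kalman update for a linear measurement model z = H x + noise.
        y = z - H @ self.x
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P


# Example out-of-FOV step: only the RNN forecast and ultrasound ranges are available.
kf = RangeAidedKalmanFilter()
kf.predict()
kf.update_forecast(pos_pred=np.array([0.30, -0.10, 0.45]))       # forecast from the RNN
for sensor, rng in [(np.array([0.1, 0.0, 0.0]), 0.52),
                    (np.array([-0.1, 0.0, 0.0]), 0.55)]:          # assumed HMD sensor layout
    kf.update_range(sensor, rng)
print(kf.x[:3])                                                   # fused position estimate
```

The key design point this sketch tries to capture is that the range-only ultrasound measurements constrain position along the sensor-to-controller direction, while the learned forecast keeps the filter anchored when the controller is out of the cameras' view; the paper's actual state model, network architecture, and noise tuning may differ.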