Enabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation

Title: Enabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation

Teams: Microsoft

Authors: Yuhang Zhao, Cynthia Bennett, Hrvoje Benko, Ed Cutrell, Christian Holz, Meredith Ringel Morris, Mike Sinclair

Publication date: April 2018

Abstract

Traditional virtual reality (VR) mainly focuses on visual feedback, which is not accessible for people with visual impairments. We created Canetroller, a haptic cane controller that simulates white cane interactions, enabling people with visual impairments to navigate a virtual environment by transferring their cane skills into the virtual world. Canetroller provides three types of feedback: (1) physical resistance generated by a wearable programmable brake mechanism that physically impedes the controller when the virtual cane comes in contact with a virtual object; (2) vibrotactile feedback that simulates the vibrations when a cane hits an object or touches and drags across various surfaces; and (3) spatial 3D auditory feedback simulating the sound of real-world cane interactions. We designed indoor and outdoor VR scenes to evaluate the effectiveness of our controller. Our study showed that Canetroller was a promising tool that enabled visually impaired participants to navigate different virtual spaces. We discuss potential applications supported by Canetroller ranging from entertainment to mobility training.
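For readers curious how such a system might be wired up in a VR engine, the sketch below illustrates one way the three feedback channels could be dispatched from a per-frame cane contact event. This is a minimal, hypothetical outline, not the authors' implementation: the CaneContact fields and the generic brake, vibrator, and audio interfaces are all assumptions made for illustration.

```python
# Hypothetical sketch (not from the paper): driving the three Canetroller
# feedback channels from one frame of virtual cane contact. All names here
# are illustrative assumptions, not the authors' API.

from dataclasses import dataclass

@dataclass
class CaneContact:
    hit_object: bool          # virtual cane tip struck an object this frame
    dragging_surface: str     # e.g. "none", "carpet", "tile", "concrete"
    contact_position: tuple   # 3D position of the contact in the scene
    impact_speed: float       # cane tip speed at the moment of impact

def apply_feedback(contact: CaneContact, brake, vibrator, audio):
    """Map one frame of virtual cane contact to the three feedback types."""
    if contact.hit_object:
        # (1) Physical resistance: engage the wearable brake so the
        #     controller is physically impeded at the point of contact.
        brake.engage()
        # (2) Vibrotactile feedback: a short pulse scaled by impact speed.
        vibrator.pulse(amplitude=min(1.0, contact.impact_speed), duration_ms=40)
        # (3) Spatial 3D audio of a cane tap, placed at the contact point.
        audio.play("cane_tap", position=contact.contact_position)
    elif contact.dragging_surface != "none":
        # Continuous texture vibration and a looping drag sound while the
        # virtual cane tip slides across a surface.
        vibrator.texture(pattern=contact.dragging_surface)
        audio.play_loop("cane_drag_" + contact.dragging_surface,
                        position=contact.contact_position)
    else:
        # No contact: release the brake and stop all feedback.
        brake.release()
        vibrator.stop()
        audio.stop_loops()
```

In practice the collision test, brake control, and spatial audio would be handled by the VR engine and device firmware; the point of the sketch is only to show how the abstract's three feedback types map onto distinct contact states.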
