Demonstration of VRBubble: Enhancing Peripheral Avatar Awareness for People with Visual Impairments in Social Virtual Reality
PubDate: April 2022
Teams: University of Wisconsin
Writers: Tiger F. Ji; Brianna R. Cochran; Yuhang Zhao
Abstract
Social Virtual Reality (VR) is increasingly used for socializing and collaboration. However, current applications are not accessible to people with visual impairments (PVI) because they center on visual experiences. We aim to design VR technologies that enhance social VR accessibility for PVI, focusing on peripheral awareness, a vital ability in social activities. Through an iterative design process with five participants, we designed VRBubble, a VR technique that facilitates peripheral awareness for PVI via spatial audio. Based on Hall's proxemic theory, VRBubble divides the social space into three bubbles: the Intimate, Personal, and Social Bubbles. It generates spatial audio feedback to distinguish avatars in different bubbles and to provide suitable avatar information. We provide three audio alternatives: earcons, verbal notifications, and sound effects. PVI can select and combine their preferred alternatives for different social contexts to maintain avatar awareness in a dynamic social VR environment.
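The bubble mechanism described above can be sketched as a small classifier: each avatar's distance from the user selects a bubble, and the user's per-bubble preference selects the audio alternative. The radii below are the conventional distances from Hall's proxemic theory, not values reported in the abstract, and the preference mapping is hypothetical; this is a minimal illustration, not the paper's implementation.

```python
# Conventional proxemic radii from Hall's theory, in meters.
# Assumed here for illustration; the paper's exact thresholds are not
# given in the abstract.
INTIMATE_RADIUS = 0.45
PERSONAL_RADIUS = 1.2
SOCIAL_RADIUS = 3.6


def classify_bubble(distance_m):
    """Map an avatar's distance from the user to one of VRBubble's zones."""
    if distance_m <= INTIMATE_RADIUS:
        return "intimate"
    if distance_m <= PERSONAL_RADIUS:
        return "personal"
    if distance_m <= SOCIAL_RADIUS:
        return "social"
    return "outside"  # beyond the Social Bubble: no feedback


def feedback_for(bubble, prefs):
    """Look up the user's chosen audio alternative for a bubble, if any."""
    return prefs.get(bubble)


# Hypothetical per-bubble preferences; the abstract names three
# alternatives: earcons, verbal notifications, and sound effects.
prefs = {
    "intimate": "verbal_notification",
    "personal": "earcon",
    "social": "sound_effect",
}

print(classify_bubble(0.3))                        # -> intimate
print(feedback_for(classify_bubble(2.0), prefs))   # -> sound_effect
```

Because avatars move continuously, a real system would re-run this classification each frame and trigger feedback only on bubble transitions, so that the audio marks avatars entering or leaving a zone rather than sounding constantly.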