
Auditory Augmented Reality: Object Sonification for the Visually Impaired

Note: We do not have the ability to review this paper.

Title: Auditory Augmented Reality: Object Sonification for the Visually Impaired

Teams: Microsoft

Writers: Flavio Protasio Ribeiro, Dinei Florencio, Philip A. Chou, Zhengyou Zhang

Publication date: September 2012

Abstract

Augmented reality applications have focused on visually integrating virtual objects into real environments. In this paper, we propose an auditory augmented reality, where we integrate acoustic virtual objects into the real world. We sonify objects that do not intrinsically produce sound, with the purpose of revealing additional information about them. Using spatialized (3D) audio synthesis, acoustic virtual objects are placed at specific real-world coordinates, obviating the need to explicitly tell the user where they are. Thus, by leveraging the innate human capacity for 3D sound source localization and source separation, we create an audio natural user interface. In contrast with previous work, we do not create acoustic scenes by transducing low-level (for instance, pixel-based) visual information. Instead, we use computer vision methods to identify high-level features of interest in an RGB-D stream, which are then sonified as virtual objects at their respective real-world coordinates. Since our visual and auditory senses are inherently spatial, this technique naturally maps between these two modalities, creating intuitive representations. We evaluate this concept with a head-mounted device, featuring modes that sonify flat surfaces, navigable paths and human faces.
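To make the core idea concrete, the sketch below shows one way an acoustic virtual object could be placed at a real-world coordinate using simple stereo spatialization cues (interaural time and level differences plus distance attenuation). This is only an illustrative approximation, assuming a head-centered coordinate frame and a sine-tone cue; the paper itself uses full spatialized (3D) audio synthesis rather than this simplified panning model, and all function names here (`sonify_object`, `write_stereo_wav`) are hypothetical.

```python
import math
import wave

import numpy as np

SAMPLE_RATE = 44100
SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, approximate head-center-to-ear distance


def sonify_object(position, duration=0.5, frequency=880.0):
    """Render a short stereo tone whose interaural time and level
    differences crudely place it at `position` = (x, y, z) in meters,
    with x to the listener's right, y up, z forward (head-centered).
    A simplified stand-in for HRTF-based 3D audio synthesis."""
    x, y, z = position
    distance = max(math.sqrt(x * x + y * y + z * z), 0.1)
    azimuth = math.atan2(x, z)  # 0 = straight ahead, positive = to the right

    # Interaural time difference (Woodworth spherical-head approximation).
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth + math.sin(azimuth))

    # Interaural level difference: constant-power pan driven by azimuth,
    # with a crude 1/distance attenuation for range.
    pan = math.sin(azimuth)  # -1 (full left) .. +1 (full right)
    left_gain = math.sqrt(0.5 * (1.0 - pan)) / distance
    right_gain = math.sqrt(0.5 * (1.0 + pan)) / distance

    t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
    # Delay the ear farther from the source by the ITD.
    left = left_gain * np.sin(2 * math.pi * frequency * (t - max(itd, 0.0)))
    right = right_gain * np.sin(2 * math.pi * frequency * (t - max(-itd, 0.0)))
    return np.stack([left, right], axis=1)


def write_stereo_wav(path, samples):
    """Write float samples in [-1, 1] to a 16-bit stereo WAV file."""
    pcm = (np.clip(samples, -1.0, 1.0) * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(2)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(pcm.tobytes())


if __name__ == "__main__":
    # Example: sonify a detected face 1.5 m ahead and 1 m to the user's right.
    write_stereo_wav("face_cue.wav", sonify_object((1.0, 0.0, 1.5)))
```

In the system described by the abstract, the input coordinates would come from computer vision modules that detect high-level features (flat surfaces, navigable paths, human faces) in the RGB-D stream, so the user localizes each cue by ear rather than being told where the object is.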
