Hearing with Eyes in Virtual Reality
PubDate: August 2019
Teams: Aalborg Universitet
Writers: Amalie Rosenkvist; David Sebastian Eriksen; Jeppe Koehlert; Miicha Valimaa; Mikkel Brogaard Vittrup; Anastasia Andreasen; George Palamas
PDF: Hearing with Eyes in Virtual Reality
Abstract
Sound and light signals propagate with similar physical properties. This provides the inspiration for an audio-visual echolocation system in which light is mapped onto the sound signal, visually representing the auralization of the virtual environment (VE). Some mammals navigate using echolocation; humans, however, are far less adept at it. To the authors’ knowledge, sound propagation and its visualization have not yet been implemented in a perceptually pleasant way and used for navigation in a VE. The core novelty of this research is therefore navigation with a visualized echolocation signal, using a cognitive mental-mapping activity in the VE.
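The mapping from an emitted sound pulse to a visible light response can be pictured with a toy model. The sketch below is illustrative only and is not taken from the paper: it assumes a simple inverse-square attenuation over the round-trip path and a speed of sound of 343 m/s, and converts each echo into a delay and a brightness value that could drive a visual pulse on the reflecting surface in a VE.

```python
# Hypothetical sketch, not the authors' implementation: maps a single echo
# (listener -> surface -> listener) to a delay and a light brightness.
import math

SPEED_OF_SOUND = 343.0  # m/s, assumed constant

def echo_to_light(emitter_pos, surface_pos, source_level=1.0):
    """Return (delay_seconds, brightness) for one echo.

    emitter_pos, surface_pos : (x, y, z) positions in metres.
    source_level             : relative intensity of the emitted pulse.
    """
    # Round-trip distance: the pulse travels to the surface and back.
    round_trip = 2.0 * math.dist(emitter_pos, surface_pos)

    # Time until the echo returns; the visual pulse would appear after this delay.
    delay = round_trip / SPEED_OF_SOUND

    # Assumed inverse-square attenuation over the round trip, clamped to [0, 1]
    # so the value can be used directly as a light brightness.
    intensity = source_level / max(round_trip ** 2, 1e-6)
    return delay, min(1.0, intensity)

if __name__ == "__main__":
    listener = (0.0, 1.7, 0.0)   # head height in the VE
    wall_hit = (0.0, 1.7, 4.0)   # a wall 4 m in front of the listener
    delay, brightness = echo_to_light(listener, wall_hit)
    print(f"echo after {delay * 1000:.1f} ms, brightness {brightness:.3f}")
```

In an actual VE this mapping would run per reflecting surface (or per ray hit), so that nearer geometry lights up sooner and brighter, giving the visual analogue of an echolocation sweep described in the abstract.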