Improving the perception of a sound source’s polar angle in mixed reality
Team: Microsoft
Authors: Hannes Gamper, Ivan Tashev
Publication date: May 2018
Abstract
Mixed reality applications blend real and virtual scenes. To render virtual objects in a scene, the rendering system needs to accurately control their perceived location. In the acoustic domain, the location of a sound source is often given in interaural coordinates, i.e., as the lateral angle, polar angle, and distance relative to the midpoint of the listener’s interaural axis. This description is useful because it separates the effects of various perceptual cues along each interaural spatial dimension. Prior research has shown that the human perception of a sound source’s polar angle, i.e., the angle of rotation about the interaural axis, tends to be less accurate than the perception of its lateral angle, i.e., the angle off the median plane. When rendering virtual sound sources, listeners often confuse locations above and below the horizontal plane, or in front of and behind them. Here, we review cues that affect the perception of polar angle, as well as approaches to improve the accuracy of rendering the polar angle of a virtual sound source in mixed reality applications.
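For illustration, here is a minimal Python sketch (not from the paper) of the interaural coordinate convention the abstract refers to. It converts Cartesian source coordinates to a lateral angle, polar angle, and distance, assuming a right-handed frame with x pointing to the front, y to the listener's left, and z up, with the origin at the midpoint of the interaural axis; the function name and sign conventions are hypothetical.

import math

def cartesian_to_interaural(x, y, z):
    """Convert Cartesian coordinates (x: front, y: left, z: up, origin at
    the midpoint of the interaural axis) to interaural coordinates
    (lateral angle, polar angle, distance), angles in degrees."""
    r = math.sqrt(x * x + y * y + z * z)
    if r == 0.0:
        raise ValueError("source coincides with the origin")
    # Lateral angle: angle off the median plane, in [-90, 90] degrees.
    lat = math.degrees(math.asin(max(-1.0, min(1.0, y / r))))
    # Polar angle: rotation about the interaural axis, in (-180, 180];
    # 0 = front, 90 = above, +/-180 = behind, -90 = below.
    pol = math.degrees(math.atan2(z, x))
    return lat, pol, r

# Example: a source in front of and above the listener, slightly to the left.
print(cartesian_to_interaural(1.0, 0.2, 1.0))  # lat ~ 8.0, pol = 45.0, r ~ 1.43

Under this convention, the confusions the abstract mentions show up purely in the polar angle: a source at a polar angle of 30 degrees (front, raised) and one at 150 degrees (back, raised) share the same lateral angle, which is why up/down and front/back errors are naturally described in this coordinate system.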