Evaluation of real-time sound propagation engines in a virtual reality framework
Teams: Facebook
Authors: Sebastià V. Amengual Garí, Carl Schissler, Ravish Mehra, Shawn Featherly, Philip W. Robinson
Publication date: March 27, 2019
Abstract
Sound propagation in an enclosed space is a combination of several wave phenomena, such as direct sound, specular reflections, scattering, diffraction, and air absorption, among others. Achieving realistic and immersive audio in games and virtual reality (VR) requires real-time modeling of these phenomena. Given that it is not clear which sound propagation aspects are perceptually most relevant in VR scenarios, an objective and perceptual comparison is conducted between two different approaches: one based on rendering only specular reflections of a geometrically simplified room (image source model, ISM), and another based on ray-tracing using custom geometries. The objective comparison analyzes the simulation results of these engines and compares them with those of a commercial room acoustic modeling software package (Odeon), commonly employed in the auralization of room acoustics. The perceptual evaluation is implemented in an immersive VR framework in which subjects are asked to compare the audio rendering approaches in an ecologically valid environment. In addition, this framework allows systematic perceptual experiments by rapidly modifying the test paradigm and the virtual scenes. The results suggest that the ISM-based engine is subjectively preferred in small to medium rooms, while large reverberant spaces are more accurately rendered with a ray-tracing approach. Thus, a combination of both methods could be a more appropriate solution for a larger variety of rooms.
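To illustrate the kind of specular-only rendering the ISM approach refers to, below is a minimal, purely illustrative sketch of a first-order image source model for an axis-aligned shoebox room. It is not the engine evaluated in the paper: it assumes perfectly rigid walls, simple 1/r free-field spreading, and ignores absorption, scattering, and diffraction; the function names (`first_order_image_sources`, `impulse_response`) are hypothetical.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, at roughly 20 degrees Celsius


def first_order_image_sources(source, room_dims):
    """Mirror a source across the six walls of an axis-aligned shoebox room.

    source: (x, y, z) position of the source inside the room.
    room_dims: (Lx, Ly, Lz) dimensions of the room, one corner at the origin.
    Returns a list of first-order image source positions.
    """
    images = []
    for axis, length in enumerate(room_dims):
        for wall in (0.0, length):
            img = np.array(source, dtype=float)
            img[axis] = 2.0 * wall - img[axis]  # reflect across the wall plane
            images.append(img)
    return images


def impulse_response(source, receiver, room_dims, fs=48000, duration=0.05):
    """Build a crude impulse response from the direct path and first-order images.

    Assumes ideal (fully reflective) walls and 1/r amplitude decay; no
    absorption, scattering, or diffraction is modeled.
    """
    receiver = np.array(receiver, dtype=float)
    paths = [np.array(source, dtype=float)] + first_order_image_sources(source, room_dims)
    ir = np.zeros(int(fs * duration))
    for p in paths:
        dist = np.linalg.norm(p - receiver)
        delay = int(round(dist / SPEED_OF_SOUND * fs))
        if delay < len(ir):
            ir[delay] += 1.0 / max(dist, 1e-6)  # free-field spreading loss
    return ir


# Example: 5 m x 4 m x 3 m room, source and receiver a few metres apart.
ir = impulse_response(source=(1.0, 1.5, 1.2), receiver=(3.5, 2.0, 1.2),
                      room_dims=(5.0, 4.0, 3.0))
```

A real-time engine of this kind would extend the idea to higher reflection orders and frequency-dependent wall absorption, whereas a ray-tracing engine instead traces many stochastic rays against the actual scene geometry, which is why the two approaches can differ in how well they capture late reverberation in large spaces.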