Machine Learning Based Auralization of Rigid Sphere Scattering
PubDate: November 2021
Teams: Aalto University
Writers: Stefan Wirler; Sebastian J. Schlecht; Ville Pulkki
PDF: Machine Learning Based Auralization of Rigid Sphere Scattering
Abstract
In this paper, we present a method to auralize the acoustic scattering and occlusion of a single rigid sphere with parametric filters and neural networks, providing fast processing and parameter estimation. The filter parameters are estimated by neural networks from the geometric parameters of the simulated scene, e.g., the relative receiver position and the size of the rigid spherical scatterer. The modeling differentiates between an unoccluded and an occluded source-receiver path, for which different filter structures are used. In contrast to numerical or analytical simulation of occlusion and scattering, the proposed approach renders with a low computational load, making it suitable for real-time auralization in virtual reality. The presented method provides a good fit for modeling the acoustic effects of a rigid sphere. Furthermore, a listening test was conducted, which showed plausible reproduction of the scattering and occlusion of a rigid sphere.
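The abstract describes a pipeline in which a neural network maps scene geometry (relative receiver position, sphere size) to the parameters of a cheap parametric filter that is then applied to the source signal. The following is a minimal sketch of that idea, not the authors' implementation: the network architecture, the choice of a one-pole low-pass as the "occluded path" filter, and all names and parameter values are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn


class FilterParamNet(nn.Module):
    """Maps geometric parameters to parametric-filter parameters (hypothetical sizes)."""

    def __init__(self, n_geom: int = 4, n_filter_params: int = 3, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_geom, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_filter_params),
        )

    def forward(self, geom: torch.Tensor) -> torch.Tensor:
        # geom: [batch, n_geom], e.g. (azimuth, elevation, distance, sphere radius)
        return self.net(geom)


def occlusion_filter(params: torch.Tensor, signal: torch.Tensor) -> torch.Tensor:
    """Apply a one-pole low-pass plus a direct-path mix (illustrative filter only).

    params: three values squashed to (0, 1) and read as [gain, pole coefficient, dry mix].
    """
    gain, a, dry = torch.sigmoid(params).unbind(-1)
    outs = []
    state = torch.zeros(signal.shape[:-1])
    for n in range(signal.shape[-1]):
        state = a * state + (1.0 - a) * signal[..., n]    # one-pole low-pass (occluded part)
        outs.append(gain * state + dry * signal[..., n])  # mix filtered and direct parts
    return torch.stack(outs, dim=-1)


if __name__ == "__main__":
    net = FilterParamNet()
    geom = torch.tensor([[0.5, 0.0, 1.5, 0.0875]])  # az [rad], el [rad], distance [m], radius [m]
    params = net(geom)                              # filter parameters predicted from geometry
    x = torch.randn(1, 480)                         # 10 ms of audio at 48 kHz
    y = occlusion_filter(params[0], x)
    print(y.shape)
```

Because only the small network and a low-order filter run per frame, rendering stays cheap compared with numerical or analytical scattering simulation, which is the property the paper exploits for real-time auralization.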