
Directional sources and listeners in interactive sound propagation using reciprocal wave field coding

Note: We are not able to review this paper.

PubDate: July 2020

Teams: Microsoft; McGill University

Writers: Chakravarty R. Alla Chaitanya; Nikunj Raghuvanshi; Keith W. Godin; Zechen Zhang; Derek Nowrouzezahrai; John M. Snyder

PDF: Directional sources and listeners in interactive sound propagation using reciprocal wave field coding

Abstract

Common acoustic sources, like voices or musical instruments, exhibit strong frequency and directional dependence. When transported through complex environments, their anisotropic radiated field undergoes scattering, diffraction, and occlusion before reaching a directionally-sensitive listener. We present the first wave-based interactive auralization system that encodes and renders a complete reciprocal description of acoustic wave fields in general scenes. Our method renders directional effects at freely moving and rotating sources and listeners and supports any tabulated source directivity function and head-related transfer function. We represent a static scene’s global acoustic transfer as an 11-dimensional bidirectional impulse response (BIR) field, which we extract from a set of wave simulations. We parametrically encode the BIR as a pair of radiating and arriving directions for the perceptually-salient initial (direct) response, and a compact 6 × 6 reflections transfer matrix capturing indirect energy transfer with scene-dependent anisotropy. We render our encoded data with an efficient and scalable algorithm – integrated in the Unreal Engine™ – whose CPU performance is agnostic to scene complexity and angular source/listener resolutions. We demonstrate convincing effects that depend on detailed scene geometry, for a variety of environments and source types.
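To make the encoding idea in the abstract concrete, here is a minimal sketch (not the authors' implementation) of how a pair of directional descriptors plus a compact 6 × 6 reflections transfer matrix could be applied at render time: project the source directivity onto a few coarse radiating directions, redistribute that energy with the matrix, and fold the arriving directional energy through a simplified listener directivity. The six axis-aligned directions, all function names, and the broadband treatment of directivity and HRTF below are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the directional rendering step described in the abstract.
# Assumed: a 6-direction axis-aligned basis, broadband (frequency-flat) gains,
# and toy stand-ins for the tabulated directivity and HRTF data.

import numpy as np

# Six coarse world directions (+X, -X, +Y, -Y, +Z, -Z) -- an assumed basis.
AXIS_DIRS = np.array([
    [1, 0, 0], [-1, 0, 0],
    [0, 1, 0], [0, -1, 0],
    [0, 0, 1], [0, 0, -1],
], dtype=float)

def directivity_weights(gain_fn, rotation):
    """Project a tabulated source directivity onto the 6 coarse radiating
    directions. `gain_fn(d)` returns the source gain toward unit direction d
    in the source's local frame; `rotation` is a 3x3 world-from-source matrix."""
    local_dirs = AXIS_DIRS @ rotation            # world axes expressed in the source frame
    w = np.array([gain_fn(d) for d in local_dirs])
    return w / max(w.sum(), 1e-9)                # normalize the radiated energy split

def reflections_gains(refl_matrix, radiated):
    """Apply the 6x6 reflections transfer matrix: energy radiated into each
    coarse source direction is redistributed over the 6 arriving directions."""
    return refl_matrix @ radiated                # 6-vector of arriving energies

def listener_mix(arriving, hrtf_gain_fn, head_rotation):
    """Fold arriving directional energy through a simplified listener
    directivity (broadband per-ear gains) to obtain a stereo gain pair."""
    dirs_local = AXIS_DIRS @ head_rotation       # arriving directions in the head frame
    left = sum(a * hrtf_gain_fn(d, ear="left") for a, d in zip(arriving, dirs_local))
    right = sum(a * hrtf_gain_fn(d, ear="right") for a, d in zip(arriving, dirs_local))
    return left, right

if __name__ == "__main__":
    # Toy stand-ins for tabulated data: a forward-facing cardioid source and a
    # left/right-biased broadband "HRTF" gain.
    cardioid = lambda d: 0.5 * (1.0 + d[0])
    hrtf = lambda d, ear: 1.0 + (0.3 * d[1] if ear == "left" else -0.3 * d[1])
    R = np.full((6, 6), 1.0 / 6.0)               # toy isotropic reflections matrix
    radiated = directivity_weights(cardioid, np.eye(3))
    arriving = reflections_gains(R, radiated)
    print(listener_mix(arriving, hrtf, np.eye(3)))
```

In this sketch, rotating the source or the listener only changes the cheap projection and mixing steps, which is consistent with the abstract's claim that rendering cost is agnostic to scene complexity once the transfer data has been precomputed.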
