Filtering Environment Illumination for Interactive Physically-Based Rendering in Mixed Reality
Teams: NVIDIA; UC Berkeley; UC San Diego
Authors: Soham Uday Mehta (UC Berkeley); Kihwan Kim (NVIDIA); Dawid Pajak (NVIDIA); Kari Pulli (NVIDIA); Jan Kautz (NVIDIA); Ravi Ramamoorthi (UC San Diego)
Publication date: June 1, 2015
Abstract
Physically correct rendering of environment illumination has been a long-standing challenge in interactive graphics, since Monte Carlo ray tracing requires thousands of rays per pixel. We propose accurate filtering of a noisy Monte Carlo image using Fourier analysis. Our novel analysis extends previous work by showing that the shape of illumination spectra is not always a line or wedge, as in previous approximations, but rather an ellipsoid. Our primary contribution is an axis-aligned filtering scheme that preserves the frequency content of the illumination. We also propose a novel application of our technique to mixed-reality scenes, in which virtual objects are inserted into a real video stream so as to become indistinguishable from the real objects. The virtual objects must be shaded under the real lighting conditions, and the mutual illumination between real and virtual objects must also be computed. To this end, we demonstrate a novel two-mode path-tracing approach that ray traces a scene containing both image-based real geometry and mesh-based virtual geometry. Finally, we are able to denoise a sparsely sampled image and render physically correct mixed-reality scenes at over 5 fps on the GPU.
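
The sketch below is a minimal illustration of the axis-aligned filtering idea, not the paper's implementation. It assumes a per-pixel bandwidth map (here called bandwidth) has already been estimated; in the paper, such bandwidths follow from the Fourier analysis of the illumination spectra. Each pixel is blurred with a screen-space Gaussian whose width is inversely proportional to the local bandwidth, so regions with high-frequency illumination are filtered less. The function and parameter names are hypothetical.

    import numpy as np

    def axis_aligned_filter(noisy, bandwidth, max_radius=8):
        """Denoise an (H, W, C) Monte Carlo image with a spatially varying
        Gaussian whose standard deviation at each pixel is inversely
        proportional to a per-pixel bandwidth estimate."""
        h, w, _ = noisy.shape
        out = np.empty_like(noisy)
        for y in range(h):
            for x in range(w):
                # Wider filter where the local illumination bandwidth is low.
                sigma = 1.0 / max(float(bandwidth[y, x]), 1e-3)
                r = int(min(np.ceil(3.0 * sigma), max_radius))
                y0, y1 = max(0, y - r), min(h, y + r + 1)
                x0, x1 = max(0, x - r), min(w, x + r + 1)
                yy, xx = np.mgrid[y0:y1, x0:x1]
                wgt = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2.0 * sigma ** 2))
                # Normalized weighted average over the clamped window.
                out[y, x] = (wgt[..., None] * noisy[y0:y1, x0:x1]).sum(axis=(0, 1)) / wgt.sum()
        return out

In practice the filter would run on the GPU and use the paper's derived per-pixel bandwidths; this toy version only conveys the core property that the filter footprint adapts to local frequency content rather than being uniform across the image.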