
Adaptive Light Estimation using Dynamic Filtering for Diverse Lighting Conditions

Note: We don't have the ability to review papers.

PubDate: August 2021

Teams: Victoria University of Wellington

Writers: Junhong Zhao; Andrew Chalmers; Taehyun Rhee

PDF: Adaptive Light Estimation using Dynamic Filtering for Diverse Lighting Conditions

Abstract

High dynamic range (HDR) panoramic environment maps are widely used to illuminate virtual objects so that they blend with real-world scenes. However, in common augmented and mixed reality (AR/MR) applications, capturing the full 360° surroundings to obtain an HDR environment map is often not possible with consumer-level devices. We present a novel light estimation method to predict 360° HDR environment maps from a single photograph with a limited field-of-view (FOV). We introduce the Dynamic Lighting network (DLNet), a convolutional neural network that dynamically generates its convolution filters from the input photograph to adaptively learn the lighting cues within each photograph. We propose novel Spherical Multi-Scale Dynamic (SMD) convolutional modules that dynamically generate sample-specific kernels for decoding features in the spherical domain to predict 360° environment maps. Using DLNet and data augmentation with respect to FOV, exposure multiplier, and color temperature, our model can estimate lighting under diverse input variations. Compared with prior work that fixes the network filters once trained, our method maintains lighting consistency across different exposure multipliers and color temperatures, and maintains robust light estimation accuracy as FOV increases. The surrounding lighting information estimated by our method ensures coherent illumination of 3D objects blended with the input photograph, enabling high-fidelity augmented and mixed reality across a wide range of environmental lighting conditions and device sensors.
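The core idea, generating convolution filters conditioned on each input sample rather than fixing them after training, can be illustrated with a short sketch. The PyTorch snippet below is a minimal, hypothetical illustration of generic dynamic filtering, not the authors' DLNet or SMD modules; the class name DynamicConv2d and all layer sizes are assumptions made for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicConv2d(nn.Module):
    """Sample-adaptive convolution: a small generator network predicts the
    convolution kernel from a global descriptor of each input, so the filter
    weights vary per photograph instead of being fixed once trained.
    Hypothetical sketch for illustration; not the paper's DLNet/SMD code."""

    def __init__(self, in_ch: int, out_ch: int, k: int = 3):
        super().__init__()
        self.in_ch, self.out_ch, self.k = in_ch, out_ch, k
        # Kernel generator: global average pool -> MLP -> per-sample weights.
        self.gen = nn.Sequential(
            nn.Linear(in_ch, 128),
            nn.ReLU(inplace=True),
            nn.Linear(128, out_ch * in_ch * k * k),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        desc = x.mean(dim=(2, 3))  # (B, C) global descriptor of lighting cues
        weight = self.gen(desc).reshape(b * self.out_ch, self.in_ch, self.k, self.k)
        # Grouped convolution applies each sample's own kernel to its own
        # feature map in a single batched call.
        y = F.conv2d(x.reshape(1, b * c, h, w), weight,
                     padding=self.k // 2, groups=b)
        return y.reshape(b, self.out_ch, h, w)

if __name__ == "__main__":
    layer = DynamicConv2d(16, 32)
    feats = torch.randn(2, 16, 64, 128)   # e.g. equirectangular feature maps
    print(layer(feats).shape)             # torch.Size([2, 32, 64, 128])
```

A spherical variant in the spirit of the paper's SMD modules would additionally adapt the kernels to the latitude-dependent distortion of the equirectangular projection; that part is omitted here.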
