
Real-Time Lighting Estimation for Augmented Reality via Differentiable Screen-Space Rendering

Note: We do not have the ability to review papers.

PubDate: January 2022

Teams: OPPO US Research Center

Writers: Celong Liu (Palo Alto, California, United States)

PDF: Real-Time Lighting Estimation for Augmented Reality via Differentiable Screen-Space Rendering

Abstract

Augmented Reality (AR) applications aim to blend virtual objects realistically into the real world. One of the key factors for realistic AR is correct lighting estimation. In this paper, we present a method that estimates the real-world lighting condition from a single image in real time, using information from an optional support plane provided by advanced AR frameworks (e.g., ARCore, ARKit). By analyzing the visual appearance of the real scene, our algorithm can predict the lighting condition from the input RGB photo. In the first stage, we use a deep neural network to decompose the scene into several components: lighting, normals, and BRDF. We then introduce differentiable screen-space rendering, a novel approach that provides the supervisory signal for regressing lighting, normals, and BRDF jointly. We recover the most plausible real-world lighting condition using Spherical Harmonics and the main directional light. Through a variety of experiments, we demonstrate that our method provides better results than prior works, both quantitatively and qualitatively, and that it can enhance real-time AR experiences.
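To give a flavor of the Spherical Harmonics lighting representation the abstract mentions, the sketch below evaluates a standard second-order (9-coefficient) real SH basis at a surface normal and uses it for diffuse shading. This is an illustrative assumption about how SH lighting is typically applied, not the paper's actual pipeline; the function names (`sh_basis`, `shade`) and the Lambertian model are hypothetical.

```python
import numpy as np

def sh_basis(n):
    """Second-order (9-term) real spherical-harmonic basis at unit normal n."""
    x, y, z = n
    return np.array([
        0.282095,                # Y_0^0
        0.488603 * y,            # Y_1^-1
        0.488603 * z,            # Y_1^0
        0.488603 * x,            # Y_1^1
        1.092548 * x * y,        # Y_2^-2
        1.092548 * y * z,        # Y_2^-1
        0.315392 * (3 * z * z - 1),  # Y_2^0
        1.092548 * x * z,        # Y_2^1
        0.546274 * (x * x - y * y),  # Y_2^2
    ])

def shade(normal, sh_coeffs, albedo):
    """Diffuse shading from SH lighting (9x3 RGB coefficients).

    The cosine-lobe convolution factors are assumed to be folded into
    the coefficients, as is common for irradiance-style SH lighting.
    """
    irradiance = sh_basis(normal) @ sh_coeffs   # (9,) @ (9, 3) -> RGB
    return albedo * np.clip(irradiance, 0.0, None)
```

A rendering step like this is differentiable with respect to the SH coefficients, normals, and albedo, which is what lets a network regress all three jointly from image-space supervision.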
