Local Light Field Fusion: Practical View Synthesis with Prescriptive Sampling Guidelines

Note: We do not have the ability to review papers.

PubDate: May 2019

Teams: UC Berkeley; Fyusion Inc.; UC San Diego

Authors: Ben Mildenhall, Pratul P. Srinivasan, Rodrigo Ortiz-Cayon, Nima Khademi Kalantari, Ravi Ramamoorthi, Ren Ng, Abhishek Kar

PDF: Local Light Field Fusion: Practical View Synthesis with Prescriptive Sampling Guidelines

Project: Local Light Field Fusion: Practical View Synthesis with Prescriptive Sampling Guidelines

Abstract

We present a practical and robust deep learning solution for capturing and rendering novel views of complex real-world scenes for virtual exploration. Previous approaches either require intractably dense view sampling or provide little to no guidance for how users should sample views of a scene to reliably render high-quality novel views. Instead, we propose an algorithm for view synthesis from an irregular grid of sampled views that first expands each sampled view into a local light field via a multiplane image (MPI) scene representation, then renders novel views by blending adjacent local light fields. We extend traditional plenoptic sampling theory to derive a bound that specifies precisely how densely users should sample views of a given scene when using our algorithm. In practice, we apply this bound to capture and render views of real-world scenes that achieve the perceptual quality of Nyquist-rate view sampling while using up to 4000× fewer views. We demonstrate our approach's practicality with an augmented reality smartphone app that guides users to capture input images of a scene, and with viewers that enable real-time virtual exploration on desktop and mobile platforms.
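The abstract compresses two technical ideas: a prescriptive view-sampling bound derived from plenoptic sampling theory, and a renderer that composites MPIs and blends neighboring local light fields. The sketch below illustrates both in Python. It is a minimal illustration under stated assumptions, not the authors' released code: it assumes the paper's guideline that adjacent captured views should differ by at most D pixels of disparity, where D is the number of MPI depth planes (versus roughly one pixel at the Nyquist rate), and the function names and example numbers are hypothetical.

```python
import numpy as np

# Illustrative sketch (hypothetical names, not the authors' released code).
# Assumed guideline: with a D-plane MPI, adjacent views may be separated by
# up to D pixels of disparity, vs. ~1 pixel at the Nyquist rate, so views
# can be captured up to D times more sparsely.

def max_camera_spacing(focal_px, z_min, z_max, num_mpi_planes):
    """Maximum baseline between adjacent cameras (same units as z_min/z_max).

    Disparity between views with baseline b spans
        d = b * focal_px * (1/z_min - 1/z_max)  [pixels],
    so requiring d <= num_mpi_planes gives the bound below.
    """
    return num_mpi_planes / (focal_px * (1.0 / z_min - 1.0 / z_max))


def composite_mpi(rgba_planes):
    """Over-composite MPI planes, ordered back (farthest) to front (nearest).

    Each plane is an (H, W, 4) float RGBA array already reprojected into
    the novel view (homography warping is omitted for brevity).
    """
    out = np.zeros_like(rgba_planes[0][..., :3])
    for rgba in rgba_planes:
        rgb, alpha = rgba[..., :3], rgba[..., 3:4]
        out = rgb * alpha + out * (1.0 - alpha)
    return out


def blend_local_light_fields(render_a, render_b, w_a):
    """Blend renderings from the two nearest local light fields; the paper
    weights neighbors by proximity to the novel viewpoint."""
    return w_a * render_a + (1.0 - w_a) * render_b


if __name__ == "__main__":
    # Hypothetical capture: ~800 px focal length, scene depths 1-10 m,
    # 32-plane MPIs, allowing ~32x the Nyquist camera spacing.
    b = max_camera_spacing(focal_px=800.0, z_min=1.0, z_max=10.0,
                           num_mpi_planes=32)
    print(f"max adjacent-view baseline: {b:.3f} m")  # ~0.044 m
```

The "over" compositing loop is standard MPI rendering; the exact plane warps and blending weights used between local light fields are specified in the paper itself.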
