OmniNeRF: Hybriding Omnidirectional Distance and Radiance fields for Neural Surface Reconstruction

Note: We are not able to review this paper.

PubDate: Sep 2022

Teams: The University of Sydney; Massachusetts Institute of Technology, Cambridge, MA, United States; Tsinghua University; Beijing Institute of Technology; The University of New South Wales

Writers: Jiaming Shen, Bolin Song, Zirui Wu, Yi Xu

PDF: OmniNeRF: Hybriding Omnidirectional Distance and Radiance fields for Neural Surface Reconstruction

Abstract

3D reconstruction from images has wide applications in Virtual Reality and Autonomous Driving, where the precision requirement is very high. Ground-breaking research on the neural radiance field (NeRF), which utilizes Multi-Layer Perceptrons, has dramatically improved the representation quality of 3D objects. Some later studies improved NeRF by building truncated signed distance fields (TSDFs) but still suffer from blurred surfaces in 3D reconstruction. In this work, this surface ambiguity is addressed by proposing a novel 3D shape representation, OmniNeRF. It is based on training a hybrid implicit field combining an Omnidirectional Distance Field (ODF) and a neural radiance field, replacing the apparent density in NeRF with omnidirectional distance information. Moreover, additional supervision on the depth map is introduced to further improve reconstruction quality. The proposed method has been shown to effectively deal with NeRF defects at the edges of the surface reconstruction, providing higher-quality 3D scene reconstruction results.
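The abstract describes a hybrid implicit field that predicts an omnidirectional distance alongside radiance, trained with an extra depth-supervision term. The following is a minimal, hypothetical sketch of that idea, not the authors' implementation: a toy MLP maps a 3D point and viewing direction to a non-negative distance and an RGB colour, and a combined loss adds an L1 depth term (weight `lam` is an assumed hyperparameter) to the usual photometric MSE.

```python
# Hypothetical sketch of a hybrid ODF + radiance field (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def hybrid_field(points, dirs, w1, w2):
    """Toy MLP: (x, y, z, dx, dy, dz) -> (distance, r, g, b)."""
    h = np.tanh(np.concatenate([points, dirs], axis=-1) @ w1)  # hidden layer
    out = h @ w2                                               # linear head
    distance = np.abs(out[..., :1])            # omnidirectional distance >= 0
    rgb = 1.0 / (1.0 + np.exp(-out[..., 1:]))  # colours squashed into [0, 1]
    return distance, rgb

def loss(pred_rgb, gt_rgb, pred_depth, gt_depth, lam=0.1):
    """Photometric MSE plus L1 depth supervision, weighted by lam."""
    color_term = np.mean((pred_rgb - gt_rgb) ** 2)
    depth_term = np.mean(np.abs(pred_depth - gt_depth))
    return color_term + lam * depth_term

# One toy batch of 8 rays with random points, directions, and weights.
pts, d = rng.normal(size=(8, 3)), rng.normal(size=(8, 3))
w1 = rng.normal(size=(6, 16)) * 0.1
w2 = rng.normal(size=(16, 4)) * 0.1
dist, rgb = hybrid_field(pts, d, w1, w2)
total = loss(rgb, np.full_like(rgb, 0.5), dist, np.ones_like(dist))
```

In a real pipeline the predicted distances would be converted into surface intersections for volume rendering; here the sketch only shows the field's outputs and the two-term supervision the abstract mentions.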
