
FreeNeRF: Improving Few-shot Neural Rendering with Free Frequency Regularization

Note: We do not have the ability to review this paper.

PubDate: June 19, 2023

Teams: NVIDIA

Writers: Jiawei Yang, Marco Pavone, Yue Wang

PDF: FreeNeRF: Improving Few-shot Neural Rendering with Free Frequency Regularization

Abstract

Novel view synthesis with sparse inputs is a challenging problem for neural radiance fields (NeRF). Recent efforts alleviate this challenge by introducing external supervision, such as pre-trained models and extra depth signals, and by non-trivial patch-based rendering. In this paper, we present Frequency regularized NeRF (FreeNeRF), a surprisingly simple baseline that outperforms previous methods with minimal modifications to the plain NeRF. We analyze the key challenges in few-shot neural rendering and find that frequency plays an important role in NeRF’s training. Based on the analysis, we propose two regularization terms. One is to regularize the frequency range of NeRF’s inputs, while the other is to penalize the near-camera density fields. Both techniques are “free lunches” at no additional computational cost. We demonstrate that even with one line of code change, the original NeRF can achieve similar performance as other complicated methods in the few-shot setting. FreeNeRF achieves state-of-the-art performance across diverse datasets, including Blender, DTU, and LLFF. We hope this simple baseline will motivate a rethinking of the fundamental role of frequency in NeRF’s training under the low-data regime and beyond.
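The abstract only sketches the two regularization terms, so below is a minimal, hypothetical illustration of what they could look like in a plain NeRF training loop: a mask on the positional encoding whose visible frequency range widens linearly over training, and a penalty on density predicted at samples nearest the camera. The function names, the linear schedule, and the `num_near_samples` cutoff are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def freq_reg_mask(pos_enc_length, current_step, total_reg_steps):
    """Linearly reveal higher positional-encoding frequencies as training progresses."""
    if current_step >= total_reg_steps:
        return np.ones(pos_enc_length)
    mask = np.zeros(pos_enc_length)
    # Fraction of the encoding vector that should currently be visible.
    visible = pos_enc_length * current_step / total_reg_steps
    mask[: int(visible)] = 1.0                        # fully visible low-frequency bands
    if int(visible) < pos_enc_length:
        mask[int(visible)] = visible - int(visible)   # partially visible next band
    return mask

def occlusion_reg(density_along_ray, num_near_samples=10):
    """Penalize density at the samples closest to the camera to suppress near-camera floaters."""
    return density_along_ray[..., :num_near_samples].mean()

# Hypothetical usage inside a NeRF training step:
#   enc = positional_encoding(xyz)                         # shape (..., pos_enc_length)
#   enc = enc * freq_reg_mask(enc.shape[-1], step, T)      # frequency regularization
#   loss = rgb_loss + occ_weight * occlusion_reg(sigma)    # occlusion regularization
```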
