
Variable Bitrate Neural Fields

Note: We are not able to review this paper.

PubDate: May 2022

Teams: NVIDIA, University of Toronto, Roblox, University of Waterloo, Adobe Research

Writers: Towaki Takikawa, Alex Evans, Jonathan Tremblay, Thomas Müller, Morgan McGuire, Alec Jacobson, Sanja Fidler

PDF: Variable Bitrate Neural Fields

Project: Variable Bitrate Neural Fields

Abstract

Neural approximations of scalar and vector fields, such as signed distance functions and radiance fields, have emerged as accurate, high-quality representations. State-of-the-art results are obtained by conditioning a neural approximation with a lookup from trainable feature grids [Liu et al. 2020; Martel et al. 2021; Müller et al. 2022; Takikawa et al. 2021] that take on part of the learning task and allow for smaller, more efficient neural networks. Unfortunately, these feature grids usually come at the cost of significantly increased memory consumption compared to stand-alone neural network models. We present a dictionary method for compressing such feature grids, reducing their memory consumption by up to 100× and permitting a multiresolution representation which can be useful for out-of-core streaming. We formulate the dictionary optimization as a vector-quantized auto-decoder problem which lets us learn end-to-end discrete neural representations in a space where no direct supervision is available and with dynamic topology and structure. Our source code is available at https://github.com/nv-tlabs/vqad.
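To make the memory arithmetic of such a dictionary method concrete, here is a minimal NumPy sketch: each feature-grid cell stores only a small index into a shared codebook instead of a full feature vector. All sizes are hypothetical, and the codebook here is random for illustration only; the paper instead learns the codebook and indices end-to-end via a vector-quantized auto-decoder.

```python
import numpy as np

# Hypothetical sizes, not taken from the paper.
GRID_CELLS = 4096      # number of feature-grid entries
FEATURE_DIM = 8        # feature vector width per entry
CODEBOOK_SIZE = 64     # dictionary entries (an index fits in 6 bits)

rng = np.random.default_rng(0)
grid = rng.standard_normal((GRID_CELLS, FEATURE_DIM)).astype(np.float32)

# Toy codebook: random here; the paper learns it jointly with the field.
codebook = rng.standard_normal((CODEBOOK_SIZE, FEATURE_DIM)).astype(np.float32)

# Quantize: each cell keeps only the index of its nearest codeword.
sq_dists = ((grid[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
indices = sq_dists.argmin(axis=1).astype(np.uint8)

# Decoding a cell is a single dictionary lookup.
decoded = codebook[indices]

# Memory: 32-bit floats per cell vs. 6-bit indices plus the small codebook.
dense_bits = grid.size * 32
compressed_bits = GRID_CELLS * np.log2(CODEBOOK_SIZE) + codebook.size * 32
print(f"compression ratio: {dense_bits / compressed_bits:.1f}x")
```

Even this toy configuration compresses the grid by roughly 25×; larger grids amortize the codebook cost further, which is how ratios up to 100× become plausible.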
