StyleMesh: Style Transfer for Indoor 3D Scene Reconstructions

PubDate: Sep 2022

Teams: Technical University of Munich; University of Michigan

Authors: Lukas Höllein; Justin Johnson; Matthias Nießner

PDF: StyleMesh: Style Transfer for Indoor 3D Scene Reconstructions

Abstract

We apply style transfer on mesh reconstructions of indoor scenes. This enables VR applications like experiencing 3D environments painted in the style of a favorite artist. Style transfer typically operates on 2D images, making stylization of a mesh challenging. When optimized over a variety of poses, stylization patterns become stretched out and inconsistent in size. On the other hand, model-based 3D style transfer methods exist that allow stylization from a sparse set of images, but they require a network at inference time. To this end, we optimize an explicit texture for the reconstructed mesh of a scene and stylize it jointly from all available input images. Our depth- and angle-aware optimization leverages surface normal and depth data of the underlying mesh to create a uniform and consistent stylization for the whole scene. Our experiments show that our method creates sharp and detailed results for the complete scene without view-dependent artifacts. Through extensive ablation studies, we show that the proposed 3D awareness enables style transfer to be applied to the 3D domain of a mesh. Our method can be used to render a stylized mesh in real-time with traditional rendering pipelines (project page: https://lukashoel.github.io/stylemesh/).
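The abstract's "depth- and angle-aware optimization" can be illustrated with a small sketch: per-pixel terms that downweight grazing views (where texels are stretched) and that map depth to a style-feature scale so pattern size stays roughly constant in world space. This is a minimal illustration under assumed conventions (`to_camera` points from the surface to the camera; the log-depth mapping is a hypothetical choice), not the paper's exact formulation.

```python
import math

def stylization_weights(normal, to_camera, depth, depth_ref=2.0):
    """Illustrative per-pixel terms for a depth- and angle-aware
    style loss (a sketch; names and the exact mapping are
    assumptions, not the paper's formulation).

    * angle weight: cosine between the surface normal and the
      direction to the camera, so grazing views count less.
    * depth scale: log2 of depth relative to a reference depth,
      usable as a style-feature pyramid level.
    """
    def normalize(v):
        length = math.sqrt(sum(x * x for x in v)) or 1e-8
        return [x / length for x in v]

    n = normalize(normal)
    c = normalize(to_camera)
    # clamp to [0, 1]: back-facing or grazing pixels get zero weight
    angle_w = max(0.0, min(1.0, sum(a * b for a, b in zip(n, c))))
    # surfaces twice as far pick the next coarser style scale
    scale = math.log2(max(depth, 1e-3) / depth_ref)
    return angle_w, scale
```

For example, a surface viewed head-on at the reference depth gets full weight and scale level 0, while a surface seen edge-on contributes nothing to the angle-weighted loss.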
