Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax
PubDate: January 2018
Teams: Tsinghua University; University of Bath
Writers: Bicheng Luo; Feng Xu; Christian Richardt; Jun-Hai Yong
PDF: Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax
Abstract
We propose a novel 360° scene representation for converting real scenes into stereoscopic 3D virtual reality content with head-motion parallax. Our image-based scene representation enables efficient synthesis of novel views with six degrees of freedom (6-DoF) by fusing motion fields at two scales: (1) disparity motion fields carry implicit depth information and are robustly estimated from multiple laterally displaced auxiliary viewpoints, and (2) pairwise motion fields enable real-time flow-based blending, which improves the visual fidelity of results by minimizing ghosting and view transition artifacts. Based on our scene representation, we present an end-to-end system that captures real scenes with a robotic camera arm, processes the recorded data, and finally renders the scene in a head-mounted display in real time (more than 40 Hz). Our approach is the first to support head-motion parallax when viewing real 360° scenes. We demonstrate compelling results that illustrate the enhanced visual experience, and hence sense of immersion, achieved with our approach compared to widely used stereoscopic panoramas.
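To make the flow-based blending idea from the abstract concrete, below is a minimal sketch of blending two neighboring captured views with a dense pairwise motion field, in the spirit of the paper's real-time view synthesis. This is not the authors' implementation: `flow_based_blend`, its parameters, and the use of OpenCV's `cv2.remap` are illustrative assumptions, and the flow is assumed to map pixels of the first image to the second.

```python
import numpy as np
import cv2  # assumed available for image warping via remap


def flow_based_blend(img_a, img_b, flow_ab, t):
    """Synthesize an intermediate view between two neighboring captures.

    img_a, img_b : H x W x 3 images from adjacent viewpoints.
    flow_ab      : H x W x 2 pairwise motion field; flow_ab[y, x] is the
                   displacement taking a pixel of img_a to img_b.
    t            : blend position in [0, 1]; 0 returns img_a, 1 returns img_b.
    """
    h, w = flow_ab.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                                 np.arange(h, dtype=np.float32))

    # Backward-warp img_a toward the intermediate viewpoint by a fraction t
    # of the flow (using the flow at the target pixel as an approximation).
    map_ax = grid_x - t * flow_ab[..., 0]
    map_ay = grid_y - t * flow_ab[..., 1]
    warped_a = cv2.remap(img_a, map_ax, map_ay, cv2.INTER_LINEAR)

    # Backward-warp img_b toward the same viewpoint by the remaining (1 - t).
    map_bx = grid_x + (1.0 - t) * flow_ab[..., 0]
    map_by = grid_y + (1.0 - t) * flow_ab[..., 1]
    warped_b = cv2.remap(img_b, map_bx, map_by, cv2.INTER_LINEAR)

    # Cross-fade the two flow-aligned images; aligning before blending is
    # what suppresses ghosting and view-transition artifacts.
    blended = (1.0 - t) * warped_a.astype(np.float32) + t * warped_b.astype(np.float32)
    return blended.astype(img_a.dtype)
```

In practice this blending would run per frame on the GPU to reach the reported rendering rates; the NumPy/OpenCV version above is only meant to show the warp-then-blend structure.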