OmniMR: Omnidirectional Mixed Reality with Spatially-Varying Environment Reflections from Moving 360° Video Cameras
PubDate:
Teams: University of Bath; Brown University
Writers: Joanna Tarko; James Tompkin; Christian Richardt
Abstract
We propose a new approach for creating omnidirectional mixed reality (OmniMR) from moving-camera 360° video. To insert virtual computer-generated elements into a moving-camera 360° video, we reconstruct camera motion and sparse scene content via structure from motion on stitched equirectangular video (the default output format of current 360° cameras). Then, to plausibly reproduce realworld lighting conditions for these inserted elements, we employ inverse tone mapping to recover high dynamic range environment maps which vary spatially along the camera path. We implement our approach into the Unity rendering engine for real-time object rendering with dynamic lighting and user interaction. This expands the use and flexibility of 360° video for mixed reality.