Real-time rendering of physical scene on virtual curved mirror with RGB-D camera networks
PubDate: September 2017
Teams: University of Kentucky; University of Dayton
Writers: Po-Chang Su; Wanxin Xu; Ju Shen; Sen-ching Samson Cheung
With the recent explosive growth of Augmented Reality (AR) and Virtual Reality (VR) platforms, technology to capture and render dynamic physical spaces has become increasingly indispensable. In this paper, we present a novel system that captures and renders dynamic 3D scenes in real time on a virtual, arbitrarily-shaped curved mirror. To achieve realistic mirror rendering, the reflected scene changes with the viewer's perspective, which is inferred by an eye detector. One challenge faced by mirror systems is the broad visual field required to accommodate the viewer's movement. To overcome the limited field of view of a single camera, our capture system is based on a network of calibrated RGB-D cameras and is scalable to arbitrarily large environments. Rendering is accomplished by ray tracing light rays from the viewpoint to the scene as reflected by the virtual curved surface. To the best of our knowledge, the proposed system is the first to render reflective dynamic scenes from real 3D data in large environments. In our experiments, we present rendering results for different curved surfaces.
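The core rendering step described in the abstract, tracing a view ray and reflecting it off the virtual curved surface, can be sketched as follows. This is not the authors' code; it is a minimal illustration of the standard mirror-reflection formula r = d − 2(d·n)n, with a sphere used as an example curved surface (the function names and the spherical-mirror assumption are ours).

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def reflect(d, n):
    """Reflect incident direction d about the unit surface normal n."""
    dot = sum(a * b for a, b in zip(d, n))
    return [a - 2.0 * dot * b for a, b in zip(d, n)]

def sphere_reflection(view_dir, hit_point, center, radius):
    """Reflected ray direction for a spherical mirror: the normal at a
    hit point p on a sphere of center c and radius R is (p - c) / R."""
    normal = [(p - c) / radius for p, c in zip(hit_point, center)]
    return reflect(normalize(view_dir), normal)
```

For an arbitrarily-shaped mirror, only the surface-normal computation changes (e.g. taken from the mesh or depth data at the intersection point); the reflection formula itself is unchanged.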