Real-Time Mixed Reality Rendering for Underwater 360° Videos
PubDate: December 2019
Teams: Victoria University of Wellington
Writers: Stephen Thompson; Andrew Chalmers; Taehyun Rhee
We present a novel mixed reality (MR) rendering and composition solution that illuminates and blends virtual objects into underwater 360° videos (360-video) in real time. Real-time underwater lighting effects (caustics, god rays, fog, and particulates) were developed to improve the overall lighting and blending quality. We also provide an MR toolkit, an interface for tuning the parameters of the underwater lighting so the user can match the lighting observed in the 360-video. Our image-based lighting automatically provides ambient and high-frequency underwater lighting, ensuring that virtual objects are lit and blended consistently with each frame of the video, semi-automatically and in real time. We conducted a user study in which participants rated our method on visual quality and presence using a five-point Likert scale. The results show that our underwater lighting is preferred over no underwater effects and over naive ambient lighting. We also report findings on which elements of our underwater lighting and interaction have a significant impact on visual quality and presence in underwater MR.
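To give a sense of the fog effect the abstract mentions, here is a minimal sketch of depth-dependent underwater fog using per-channel Beer-Lambert absorption, where red light attenuates fastest. The coefficients, water colour, and function name are illustrative assumptions, not values or code from the paper, whose shaders are not reproduced here.

```python
import math

# Illustrative per-channel absorption coefficients (1/m); red is
# absorbed fastest in water, blue slowest. NOT taken from the paper.
ABSORPTION = (0.45, 0.07, 0.03)   # (R, G, B)
WATER_COLOR = (0.0, 0.25, 0.35)   # ambient fog colour to blend toward

def underwater_fog(color, distance_m):
    """Attenuate an RGB colour over `distance_m` metres of water.

    Each channel is scaled by its Beer-Lambert transmittance
    exp(-k * d) and blended toward the water fog colour, so distant
    virtual objects fade into the surrounding water as in the video.
    """
    out = []
    for c, k, w in zip(color, ABSORPTION, WATER_COLOR):
        t = math.exp(-k * distance_m)      # per-channel transmittance
        out.append(c * t + w * (1.0 - t))  # fade toward fog colour
    return tuple(out)
```

At distance zero the object colour is unchanged; as distance grows, the result converges to the water colour, which is what lets a composited virtual object blend with the captured underwater footage.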