Viewpoint Quality Evaluation for Augmented Virtual Environment
PubDate: September 2018
Teams: Beihang University
Writers: Ming Meng, Yi Zhou, Chong Tan, Zhong Zhou
Augmented Virtual Environment (AVE) fuses real-time video streams with virtual scenes to provide a new capability for real-world run-time perception. Although this technique has been developed for many years, it still suffers from fusion correctness, complexity, and image distortion during fly-through. Image distortion is commonly found in AVE systems and is determined by the viewpoint of the environment. Existing work lacks an evaluation of viewpoint quality and therefore fails to optimize the fly path for AVE. In this paper, we propose a novel method of viewpoint quality evaluation (VQE), taking texture distortion as the evaluation metric. Texture stretch and object fragment are taken as the main factors of distortion. We visually compare our method with viewpoint entropy on a campus scene, demonstrating that our method is superior in reflecting the degree of distortion. Furthermore, we conduct a user study, revealing that our method is suitable for high-quality demonstration with viewpoint control for AVE.
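The abstract names two distortion factors, texture stretch and object fragment, that are combined into a single viewpoint quality score. The paper does not give its formula here, so the following is only a hypothetical sketch: it assumes each factor is sampled per pixel, normalized to [0, 1], and combined by a weighted average (the function name, weights, and inputs are all illustrative, not the authors' method).

```python
# Hypothetical per-viewpoint distortion score, assuming the two factors
# from the abstract (texture stretch, object fragment) are given as
# per-pixel values normalized to [0, 1].
def viewpoint_distortion(stretch, fragment, w_stretch=0.5, w_fragment=0.5):
    """Return the mean weighted distortion over all sampled pixels.

    Lower scores would indicate a better (less distorted) viewpoint.
    """
    if len(stretch) != len(fragment) or not stretch:
        raise ValueError("inputs must be equal-length, non-empty samples")
    total = sum(w_stretch * s + w_fragment * f
                for s, f in zip(stretch, fragment))
    return total / len(stretch)


# Example: two sampled pixels, one undistorted, one heavily fragmented.
score = viewpoint_distortion([0.2, 0.4], [0.0, 1.0])  # -> 0.4
```

A fly-path optimizer could then rank candidate viewpoints by this score and prefer low-distortion ones, which matches the paper's stated goal of viewpoint control for AVE.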