Texture Blending for Photorealistic Composition on Mobile AR Platform
PubDate: May 2021
Teams: Xiamen University
Writers: Guilin Li; Pan Chen; Shihui Guo
PDF: Texture Blending for Photorealistic Composition on Mobile AR Platform
Abstract
This work establishes a client-cloud collaborative computing framework that realizes augmented reality photo synthesis on mobile platforms. The framework addresses the bottleneck of unrealistic image rendering caused by the limited computing power of mobile hardware in augmented reality synthesis. On the client side, a real-time preview lets users perceive the final result; on the cloud side, a complete rendering pipeline with high computing power achieves highly realistic rendering and composition. Beyond augmenting reality, the framework increases interaction between users and virtual models. We propose a method that integrates user interaction input with image semantic information to overcome the inaccurate estimation of depth information in distant areas. Finally, this work builds a sample application around the augmented reality display of a virtual cherry tree. The results show that the framework is highly practical and can automatically compose highly realistic photos.
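The client-cloud split described in the abstract can be illustrated with a minimal sketch. All names and structures below are hypothetical (not from the paper): the client produces a fast low-fidelity preview so the user can adjust the model placement interactively, then submits the final job to a cloud renderer with more compute for the photorealistic composite.

```python
# Illustrative sketch of a client-cloud AR composition workflow.
# Identifiers (CompositionJob, client_preview, cloud_render) are assumptions,
# not APIs from the paper.

from dataclasses import dataclass

@dataclass
class CompositionJob:
    photo_id: str            # background photo captured on the device
    model_id: str            # virtual model to insert (e.g. a cherry tree)
    position: tuple          # user-chosen anchor in image coordinates
    user_depth_hint: float   # interaction input refining distant-area depth

def client_preview(job: CompositionJob) -> dict:
    """Cheap on-device approximation: low resolution, no full lighting."""
    return {"job": job.photo_id, "quality": "preview",
            "resolution": (480, 320)}

def cloud_render(job: CompositionJob) -> dict:
    """Full cloud pipeline: would combine the user's depth hint with
    semantic segmentation to place and shade the model realistically."""
    depth = max(job.user_depth_hint, 0.0)  # semantic refinement would go here
    return {"job": job.photo_id, "quality": "photorealistic",
            "resolution": (1920, 1080), "depth_used": depth}

job = CompositionJob("photo_001", "cherry_tree", (640, 360), 12.5)
preview = client_preview(job)   # fast feedback loop on the client
final = cloud_render(job)       # final high-fidelity composite
print(preview["quality"], final["quality"])
```

The design point is that only placement and interaction metadata need to travel to the cloud with the photo; the expensive rendering never runs on the phone.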