High-quality First-person Rendering Mixed Reality Gaming System for In Home Setting
PubDate: January 2021
Teams: The University of Texas at Dallas
Writers: Yu-Yen Chung; Hung-Jui Guo; Hiranya Garbha Kumar; Balakrishnan Prabhakaran
With the advent of low-cost RGB-D cameras, mixed reality serious games using "live" 3D human avatars have become popular. Here, RGB-D cameras are used to capture and transfer a user's motion and texture onto a 3D human avatar in virtual environments. A system with a single camera is more suitable for such mixed reality games deployed in homes, considering the ease of setting up the system. In these mixed reality games, users can have either a third-person or a first-person perspective of the virtual environments used in the games. Since a first-person perspective provides a better Sense of Embodiment (SoE), in this paper, we explore the problem of providing a first-person perspective for mixed reality serious games played in homes. We propose a real-time textured humanoid-avatar framework to provide a first-person perspective and address the challenges involved in setting up such a gaming system in homes. Our approach comprises: (a) SMPL humanoid model optimization for capturing the user's movements continuously; (b) a real-time texture transferring and merging OpenGL pipeline that builds a global texture atlas across multiple video frames. We target the proposed approach towards a serious game for amputees, called Mr.MAPP (Mixed Reality-based framework for Managing Phantom Pain), in which the amputee's intact limb is mirrored in real time in the virtual environment. For this purpose, our framework also introduces a mirroring method to generate a textured phantom limb in the virtual environment. We carried out a series of visual and metrics-based studies to evaluate the effectiveness of the proposed approaches for skeletal pose fitting and texture transfer to SMPL humanoid models, as well as for mirroring and texturing the missing limb (for future amputee-based studies).
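The limb-mirroring idea described above can be illustrated with a small sketch. Assuming the avatar pose is expressed as SMPL's standard 24-joint axis-angle parameters, reflecting a rotation across the body's sagittal (YZ) plane keeps the x component of each axis-angle vector and negates y and z, after which left/right joint indices are swapped so the intact limb drives its mirrored counterpart. The helper names below are illustrative, not from the paper:

```python
import numpy as np

# Standard SMPL 24-joint layout: left/right joint pairs that swap under mirroring
# (hips, knees, ankles, feet, collars, shoulders, elbows, wrists, hands).
SMPL_LR_PAIRS = [(1, 2), (4, 5), (7, 8), (10, 11), (13, 14),
                 (16, 17), (18, 19), (20, 21), (22, 23)]

def mirror_smpl_pose(pose):
    """Mirror a (24, 3) axis-angle SMPL pose across the sagittal (YZ) plane.

    Reflecting a rotation R by M = diag(-1, 1, 1) yields M @ R @ M, which in
    axis-angle form keeps the x component and negates y and z. Left/right
    joints are then swapped so each limb's motion appears on its counterpart.
    """
    mirrored = pose.copy()
    mirrored[:, 1:] *= -1.0           # negate y and z axis-angle components
    for l, r in SMPL_LR_PAIRS:        # swap left/right joint chains
        mirrored[[l, r]] = mirrored[[r, l]]
    return mirrored
```

For instance, a bend applied to the left elbow (joint 18) would reappear, mirrored, on the right elbow (joint 19); applying the function twice recovers the original pose. A full phantom-limb pipeline would additionally blend only the amputated limb's joints rather than mirroring the whole body.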