Multimodal Data Integration for Interactive and Realistic Avatar Simulation in Augmented Reality
PubDate: November 2021
Teams: University of Miami
Writers: Anchen Sun; Yudong Tao; Mei-Ling Shyu; Shu-Ching Chen; Angela Blizzard; William Andrew Rothenberg; Dainelys Garcia; Jason F Jent
Augmented Reality (AR) allows users to view virtual objects and the real world simultaneously, providing a more realistic and immersive way to interact with them. In this paper, we propose to integrate data of different modalities (such as animation, audio, and user behavior data) to generate a realistic child avatar in the AR environment that is responsive to the user's behaviors. The proposed framework has three components: the avatar interaction system, the avatar action control system, and the avatar display system. The avatar interaction system leverages user behavior data to enable reasonable interactions with the avatar, the avatar action control system integrates action and audio data to generate realistic avatar actions, and the avatar display system presents the avatar to the user via the AR interface. Furthermore, a child tantrum management training application is implemented on top of the proposed system, allowing users to experience child tantrums and learn how to respond to and manage different tantrum situations. The application based on our system runs in real time at an average of 93.41 fps. Qualitative evaluation shows that the simulated child avatar is sufficiently realistic and responds to the user's gaze instantly in the AR environment. Clinical evaluation by pediatricians and trained psychologists confirms that the action, reaction, treatment, and mitigation of the child avatar's behavior across different tantrum levels are accurately represented in the avatar action controller.
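As a rough illustration of how the three components might compose, the sketch below wires user behavior data through an interaction step (which adjusts a tantrum level), an action control step (which pairs an animation with its audio), and a display step (a stand-in for the AR renderer). All names, data fields, and the tantrum-update rules here are hypothetical, not taken from the paper; the actual policy is clinically informed.

```python
from dataclasses import dataclass

# Hypothetical mapping from tantrum level to an animation/action name.
TANTRUM_ACTIONS = {
    0: "calm_idle",
    1: "whine",
    2: "cry",
    3: "scream_and_stomp",
}


@dataclass
class UserBehavior:
    """User behavior data captured in AR (assumed fields for illustration)."""
    gaze_on_avatar: bool
    spoke_calmly: bool


def interaction_system(behavior: UserBehavior, tantrum_level: int) -> int:
    """Avatar interaction system: map user behavior to an updated tantrum level.

    Illustrative rules only: a calm, attentive response lowers the level,
    while ignoring the child raises it.
    """
    if behavior.gaze_on_avatar and behavior.spoke_calmly:
        return max(0, tantrum_level - 1)
    if not behavior.gaze_on_avatar:
        return min(3, tantrum_level + 1)
    return tantrum_level


def action_control_system(tantrum_level: int) -> dict:
    """Avatar action control system: pair an animation with its audio clip."""
    action = TANTRUM_ACTIONS[tantrum_level]
    return {"animation": action, "audio": f"{action}.wav"}


def display_system(frame: dict) -> str:
    """Avatar display system: stand-in for rendering via the AR interface."""
    return f"render {frame['animation']} with {frame['audio']}"


def step(behavior: UserBehavior, tantrum_level: int) -> tuple[int, str]:
    """One simulation step: behavior -> interaction -> action -> display."""
    level = interaction_system(behavior, tantrum_level)
    frame = action_control_system(level)
    return level, display_system(frame)
```

In a real AR application, `step` would run once per rendered frame, with `UserBehavior` populated from headset gaze tracking and microphone input rather than hand-set flags.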