Virtual Humans in Augmented Reality: A First Step towards Real-World Embedded Virtual Roleplayers
PubDate: September 2019
Teams: University of Southern California, USC Institute for Creative Technologies
Writers: Arno Hartholt; Sharon Mozgai; Ed Fast; Matt Liewer; Adam Reilly; Wendy Whitcup; Albert “Skip” Rizzo
PDF: Virtual Humans in Augmented Reality: A First Step towards Real-World Embedded Virtual Roleplayers
Abstract
We present one of the first applications of virtual humans in Augmented Reality (AR), which gives young adults with Autism Spectrum Disorder (ASD) the opportunity to practice job interviews. It uses the sensors of the Magic Leap AR headset to provide users with immediate feedback on six different metrics, including eye gaze, blink rate, and head orientation. The system provides two characters, each with three conversational modes. The system was ported from an existing desktop application; the main development lessons learned were to: 1) provide users with navigation instructions in the user interface, 2) avoid dark colors, as they are rendered transparently, 3) use dynamic gaze so characters maintain eye contact with the user, 4) use hardware sensors such as eye gaze to provide user feedback, and 5) use surface detection to place characters dynamically in the world.
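The abstract names eye gaze and blink rate among the six feedback metrics derived from the headset's sensors. As a rough illustration of how such metrics can be computed from a stream of eye-tracking samples, here is a minimal Python sketch; the data shapes, the 10-degree eye-contact cone, and the function names are assumptions for illustration, not the authors' implementation (the actual system runs on the Magic Leap, typically via a game engine).

```python
# Illustrative sketch only: simple interview-feedback metrics from
# hypothetical headset eye-tracking samples (not the paper's code).
from dataclasses import dataclass
import math

@dataclass
class GazeSample:
    timestamp: float            # seconds since session start
    gaze_dir: tuple             # unit vector of the user's gaze (world frame)
    to_character: tuple         # unit vector from the user toward the character
    is_blink: bool              # True if the eye tracker flagged a blink event

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def eye_contact_ratio(samples, cone_deg=10.0):
    """Fraction of samples whose gaze falls within a cone around the character."""
    threshold = math.cos(math.radians(cone_deg))
    hits = sum(1 for s in samples if _dot(s.gaze_dir, s.to_character) >= threshold)
    return hits / len(samples) if samples else 0.0

def blink_rate_per_minute(samples):
    """Blink events per minute over the sampled interval."""
    if len(samples) < 2:
        return 0.0
    duration_min = (samples[-1].timestamp - samples[0].timestamp) / 60.0
    blinks = sum(1 for s in samples if s.is_blink)
    return blinks / duration_min if duration_min > 0 else 0.0
```

In a live session, metrics like these would be recomputed over a sliding window so the system can give the user immediate, in-context feedback during the interview.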