Human-Robot Interaction in a Shared Augmented Reality Workspace
PubDate: Jul 2020
Teams: UCLA Center for Vision, Cognition, Learning, and Autonomy (VCLA)
Writers: Shuwen Qiu, Hangxin Liu, Zeyu Zhang, Yixin Zhu, Song-Chun Zhu
PDF: Human-Robot Interaction in a Shared Augmented Reality Workspace
Abstract
We design and develop a new shared Augmented Reality (AR) workspace for Human-Robot Interaction (HRI), which establishes bi-directional communication between human agents and robots. In a prototype system, the shared AR workspace enables shared perception: the physical robot not only perceives the virtual elements in its own view but also infers the utility of the human agent (the cost needed to perceive and interact in AR) by sensing the human agent's gaze and pose. This new HRI design also affords shared manipulation, wherein the physical robot can control and alter virtual objects in AR as an active agent; crucially, the robot can proactively interact with human agents instead of merely executing received commands. In experiments, we design a resource collection game that qualitatively demonstrates how a robot perceives, processes, and manipulates in AR, and quantitatively evaluates the efficacy of HRI in the shared AR workspace. We further discuss how the system could benefit future HRI studies that would otherwise be challenging to conduct.
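To make the idea of inferring the human agent's utility from gaze and pose concrete, below is a minimal Python sketch of one plausible formulation: a weighted sum of a gaze-alignment cost (how far an object lies from the gaze ray) and a reach cost (how far it is from the hand). All function names, weights, and thresholds here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of utility inference in a shared AR workspace.
# The weights, field-of-view, and reach values are illustrative
# assumptions, not taken from the paper.
import numpy as np

def perception_cost(gaze_origin, gaze_dir, obj_pos, fov_deg=60.0):
    """Angular cost: 0 when the object lies on the gaze ray, growing as
    it leaves an assumed field of view (normalized; >1 means outside)."""
    to_obj = obj_pos - gaze_origin
    to_obj = to_obj / np.linalg.norm(to_obj)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    angle = np.degrees(np.arccos(np.clip(np.dot(gaze_dir, to_obj), -1.0, 1.0)))
    return angle / fov_deg

def interaction_cost(hand_pos, obj_pos, reach=0.8):
    """Distance cost, normalized by an assumed comfortable reach (meters)."""
    return np.linalg.norm(obj_pos - hand_pos) / reach

def human_utility(gaze_origin, gaze_dir, hand_pos, obj_pos,
                  w_see=0.5, w_act=0.5):
    """Lower combined perception + interaction cost -> higher utility."""
    cost = (w_see * perception_cost(gaze_origin, gaze_dir, obj_pos)
            + w_act * interaction_cost(hand_pos, obj_pos))
    return -cost

if __name__ == "__main__":
    gaze_origin = np.array([0.0, 0.0, 1.6])   # head position (m)
    gaze_dir = np.array([1.0, 0.0, -0.2])     # current gaze direction
    hand_pos = np.array([0.3, 0.0, 1.0])
    near_obj = np.array([1.0, 0.1, 1.3])      # roughly along the gaze ray
    far_obj = np.array([-1.0, 2.0, 0.5])      # off to the side, far away
    for name, obj in [("near", near_obj), ("far", far_obj)]:
        print(name, round(human_utility(gaze_origin, gaze_dir, hand_pos, obj), 3))
```

Under such a formulation, a robot acting as an active agent in the shared manipulation sense could, for example, move or highlight virtual objects whose estimated utility for the human falls below a threshold, rather than waiting for an explicit command.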