The Effects of Robots' Use of Abstract Pointing Gestures in Large-Scale Environments
Date: March 2024
Team: Colorado School of Mines
Writers: Annie Huang, Alyson Ranucci, Adam Stogsdill, Grace Clark, Keenan Schott, Mark Higger, Zhao Han, Tom Williams
PDF: The Effects of Robots' Use of Abstract Pointing Gestures in Large-Scale Environments
Abstract
As robots are deployed into large-scale human environments, they will need to engage in task-oriented dialogues about objects and locations beyond those that can currently be seen. In these contexts, speakers use a wide range of referring gestures beyond those used in the small-scale interaction contexts that HRI research typically investigates. In this work, we thus seek to understand how robots can better generate gestures to accompany their referring language in large-scale interaction contexts. In service of this goal, we present the results of two human-subject studies: (1) a human-human study exploring how human gestures change in large-scale interaction contexts and identifying human-like gestures that are suitable to such contexts yet readily implemented on robot hardware; and (2) a human-robot study conducted in a tightly controlled Virtual Reality environment to evaluate robots' use of those identified gestures. Our results show that robot use of Precise Deictic and Abstract Pointing gestures affords different types of benefits when used to refer to visible vs. non-visible referents, leading us to formulate three concrete design guidelines. These results highlight both the opportunities for robot use of more humanlike gestures in large-scale interaction contexts and the need for future work exploring their use as part of multi-modal communication.