DualGaze: Addressing the Midas Touch Problem in Gaze Mediated VR Interaction
PubDate: April 2019
Teams: Nanyang Technological University; Nanyang Technological University; The Chinese University of Hong Kong; The Hong Kong University of Science and Technology
Writers: Pallavi Mohan; Wooi Boon Goh; Chi-Wing Fu; Sai-Kit Yeung
With the increasing acceptance of eye tracking as a viable interaction method for Virtual Reality (VR) headsets, gaze interaction methods need to be carefully designed to address common challenges such as the Midas Touch problem, where users unintentionally select on-screen objects simply by gazing at them. This paper presents DualGaze, a novel interaction method in which users perform a distinctive two-step gaze gesture for object selection. Once users gaze upon an object they wish to select, a confirmation flag pops up next to the object at a location their gaze just passed through. This trajectory-adaptive flag placement reduces the chance of unintentional confirmation by requiring a deliberate returning gaze back to the flag. We conducted a user study comparing the accuracy and selection speed of DualGaze against the popular gaze-fixation method on a simple gaze-typing task. Our results show that DualGaze is significantly more accurate while maintaining a comparable selection speed, which improved as users became familiar with the method.