FocusFlow: Leveraging Focal Depth for Gaze Interaction in Virtual Reality
PubDate: Oct 2023
Teams: University of Illinois at Urbana-Champaign
Writers: Chenyang Zhang, Tiansu Chen, Rohan Russel Nedungadi, Eric Shaffer, Elahe Soltanaghai
PDF: FocusFlow: Leveraging Focal Depth for Gaze Interaction in Virtual Reality
Abstract
Current gaze input methods for VR headsets predominantly use the gaze ray as a pointing cursor, often neglecting the depth information it carries. This study introduces FocusFlow, a novel gaze interaction technique that integrates focal depth as an additional gaze input dimension, enabling users to actively shift their focus along the depth axis for interaction. We develop a detection algorithm that identifies the user's focal depth and, building on it, propose a layer-based UI that maps changes in focal depth to layer-switch operations, offering an intuitive hands-free selection method. We also design visual cues that guide users to adjust their focal depth accurately and become familiar with the interaction process. Preliminary evaluations demonstrate the system's usability, and several potential applications are discussed. Through FocusFlow, we aim to enrich the input dimensions of gaze interaction, achieving more intuitive and efficient human-computer interaction on headset devices.
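The abstract does not describe how focal depth is detected or how depth shifts are mapped to layer switches. The sketch below is one plausible illustration of the general idea, not the authors' method: it estimates focal depth from the vergence of the two eyes' gaze rays (the point of closest approach between them) and toggles between two hypothetical UI layers when the smoothed depth crosses a hysteresis band. The function names, thresholds, and smoothing constant are all assumptions for illustration only.

```python
import numpy as np

def estimate_focal_depth(left_origin, left_dir, right_origin, right_dir):
    """Vergence-based focal depth estimate (assumed approach, not the paper's):
    distance from the midpoint of the eyes to the point of closest approach
    between the left and right gaze rays. Directions are assumed normalized."""
    w0 = left_origin - right_origin
    a = np.dot(left_dir, left_dir)
    b = np.dot(left_dir, right_dir)
    c = np.dot(right_dir, right_dir)
    d = np.dot(left_dir, w0)
    e = np.dot(right_dir, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-8:
        # Near-parallel rays: treat gaze as effectively at far distance.
        return 10.0
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    midpoint = 0.5 * ((left_origin + t * left_dir) + (right_origin + s * right_dir))
    cyclopean = 0.5 * (left_origin + right_origin)
    depth = float(np.linalg.norm(midpoint - cyclopean))
    return float(np.clip(depth, 0.1, 10.0))


class LayerSwitcher:
    """Toggle between a 'near' and a 'far' UI layer when the smoothed focal
    depth crosses hysteresis thresholds (hypothetical values in meters)."""

    def __init__(self, near_m=0.8, far_m=2.0, alpha=0.2):
        self.near_m, self.far_m, self.alpha = near_m, far_m, alpha
        self.depth = None
        self.layer = "near"

    def update(self, raw_depth):
        # Exponential smoothing to suppress noisy per-frame depth estimates.
        if self.depth is None:
            self.depth = raw_depth
        else:
            self.depth = self.alpha * raw_depth + (1 - self.alpha) * self.depth
        # Hysteresis prevents flicker when the depth hovers near a threshold.
        if self.layer == "near" and self.depth > self.far_m:
            self.layer = "far"
        elif self.layer == "far" and self.depth < self.near_m:
            self.layer = "near"
        return self.layer


if __name__ == "__main__":
    switcher = LayerSwitcher()
    left_eye = np.array([-0.032, 0.0, 0.0])   # assumed ~64 mm interpupillary distance
    right_eye = np.array([0.032, 0.0, 0.0])
    for target_z in (0.5, 0.5, 3.0, 3.0, 3.0):  # user refocuses from near to far
        target = np.array([0.0, 0.0, target_z])
        ld = (target - left_eye) / np.linalg.norm(target - left_eye)
        rd = (target - right_eye) / np.linalg.norm(target - right_eye)
        depth = estimate_focal_depth(left_eye, ld, right_eye, rd)
        print(f"depth ~ {depth:.2f} m -> layer: {switcher.update(depth)}")
```

In practice, the raw depth signal from eye trackers is noisy, which is why a sketch like this pairs smoothing with hysteresis; the paper's own visual cues presumably serve a related role in helping users hold a stable focal depth.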