Gaze-aware displays and interaction
PubDate: August 2021
Authors: Katerina Mania; Ann McNamara; Andreas Polychronakis
Abstract
Being able to detect and employ gaze enhances digital displays. Research on gaze-contingent, or gaze-aware, display devices dates back two decades, but only now can gaze truly be employed for fast, low-latency gaze-based interaction and for the optimization of computer graphics rendering, such as foveated rendering. Moreover, Virtual Reality (VR) is becoming ubiquitous. The widespread availability of consumer-grade VR Head Mounted Displays (HMDs) has transformed VR into a commodity available for everyday use, and VR applications are now abundantly designed for recreation, work and communication. However, interacting with VR setups requires new paradigms of User Interfaces (UIs), since traditional 2D UIs are designed to be viewed from a static vantage point only, e.g. the computer screen. In addition, traditional input devices such as the keyboard and mouse are hard to manipulate while the user wears an HMD. Recently, companies such as HTC announced headsets with embedded eye tracking, so novel, immersive 3D UI paradigms within a VR setup can now be controlled via eye gaze. Gaze-based interaction is intuitive and natural for users: tasks can be performed directly in the 3D spatial context without having to search for an out-of-view keyboard or mouse. Furthermore, people with physical disabilities, who already depend on technology for recreation and basic communication, can benefit even more from VR. This course presents timely, relevant information on how gaze-contingent displays in general, including VR headsets with recently added eye-tracking capabilities, can leverage eye-tracking data to optimize the user experience and to alleviate usability issues surrounding intuitive interaction. Research topics covered include saliency models, gaze prediction, gaze tracking, gaze direction, foveated rendering, stereo grading and gaze-based 3D UIs on any gaze-aware display technology.
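To make the idea of gaze-contingent rendering mentioned above concrete, the sketch below (not taken from the course material) shows one simple form of foveation: pixels farther from the tracked gaze point in visual angle are shaded at progressively coarser rates. The field-of-view value, eccentricity thresholds and function names are illustrative assumptions, not parameters of any particular headset or renderer.

```python
import math

def eccentricity_deg(px, py, gaze_x, gaze_y, screen_w, fov_h_deg=100.0):
    """Approximate angular distance (degrees) of a pixel from the gaze point.

    Assumes a simple pinhole model; fov_h_deg is an illustrative horizontal
    field of view, not a value taken from any specific headset.
    """
    # Focal length in pixels implied by the assumed horizontal field of view.
    f = (screen_w / 2.0) / math.tan(math.radians(fov_h_deg / 2.0))
    dx, dy = px - gaze_x, py - gaze_y
    return math.degrees(math.atan2(math.hypot(dx, dy), f))

def shading_rate(ecc_deg):
    """Map eccentricity to a coarseness factor (1 = full shading rate).

    The thresholds are illustrative; real systems tune them to display
    pixel density and to the eye tracker's accuracy and latency.
    """
    if ecc_deg < 5.0:      # foveal region: full shading rate
        return 1
    elif ecc_deg < 15.0:   # parafoveal region: shade every 2x2 pixel block
        return 2
    else:                  # periphery: shade every 4x4 pixel block
        return 4

# Example: gaze fixed at the centre of a 2000x2000 per-eye buffer.
print(shading_rate(eccentricity_deg(1000, 1000, 1000, 1000, 2000)))  # -> 1 (fovea)
print(shading_rate(eccentricity_deg(1900, 1000, 1000, 1000, 2000)))  # -> 4 (periphery)
```

In a real foveated renderer this mapping would drive hardware variable-rate shading or multi-resolution render targets rather than per-pixel Python code; the sketch only illustrates the eccentricity-to-detail idea the abstract refers to.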