
Depth-based subtle gaze guidance in virtual reality environments

Note: We are not able to review this paper.

PubDate: September 2015

Teams: Rochester Institute of Technology

Writers: Srinivas Sridharan; James Pieszala; Reynold Bailey

PDF: Depth-based subtle gaze guidance in virtual reality environments

Abstract

Virtual reality headsets and immersive head-mounted displays have become commonplace and have found applications in digital gaming, film, and education. A sense of immersion is created by surrounding the user of the VR system with photo-realistic scenes, sound, or other stimuli (e.g., haptics) that provide an engrossing experience for the viewer. The ability to interact with objects in the virtual environment has added to its appeal for learning and education. In this proposed work, we plan to explore the ability to subtly guide viewers’ attention to important regions in a controlled 3D virtual scene. The subtle gaze guidance approach [Bailey et al. 2009] combines eye tracking with subtle image-space modulations to guide the viewer’s attention about a scene. These modulations are terminated before the viewer can fixate on them with their high-acuity foveal vision. This approach is preferred over overt techniques that make permanent changes to the scene being viewed. It has also been tested in controlled real-world environments [Booth et al. 2013]. The key challenge for such a system is the need for an external projector to present modulations on the scene objects to guide the viewer’s attention. A VR system, however, lets the user view and interact with a 3D scene that is close to reality, thereby allowing researchers to digitally manipulate the 3D scene for active gaze guidance.
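
The core loop of subtle gaze guidance can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the angular threshold, and the sinusoidal luminance flicker below are assumptions chosen for demonstration. The idea it shows is the one described in the abstract: a brief modulation is presented at the target region while it lies in the viewer's periphery, and it is cancelled as soon as the gaze heads toward it, before high-acuity foveal vision can resolve the change.

```python
import math

# Hypothetical sketch of a subtle gaze guidance update, run once per rendered frame.
# The threshold, flicker parameters, and the scene.set_region_luminance() call are
# illustrative assumptions, not part of the paper.

CANCEL_THRESHOLD_DEG = 10.0  # assumed: stop modulating once gaze is this close to the target


def angular_distance_deg(gaze_dir, target_dir):
    """Angle in degrees between two unit direction vectors (gaze and target region)."""
    dot = sum(g * t for g, t in zip(gaze_dir, target_dir))
    dot = max(-1.0, min(1.0, dot))
    return math.degrees(math.acos(dot))


def update_gaze_guidance(gaze_dir, target_dir, scene, t):
    """Modulate the target region while it is in the periphery; terminate the
    modulation before the viewer can fixate on it with foveal vision."""
    if angular_distance_deg(gaze_dir, target_dir) > CANCEL_THRESHOLD_DEG:
        # Target is in the periphery: apply a subtle 10 Hz luminance flicker there.
        scene.set_region_luminance(target_dir, 1.0 + 0.1 * math.sin(2 * math.pi * 10 * t))
    else:
        # Gaze is approaching the target: restore the unmodified scene immediately.
        scene.set_region_luminance(target_dir, 1.0)
```

Because the VR system renders the scene itself, this check-and-cancel step can be applied directly to the 3D content each frame, which is what removes the need for the external projector used in the real-world setup.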
