Making small spaces feel large: infinite walking in virtual reality
PubDate: July 2015
Teams: University of Southern California
Writers: Evan A. Suma;Mahdi Azmandian;Timofey Grechkin;Thai Phan;Mark Bolas
Abstract
Over the past few years, virtual reality has experienced a resurgence. Fueled by a proliferation of consumer-level head-mounted displays and motion tracking devices, an unprecedented quantity of immersive experiences and content is now available for both desktop and mobile platforms. However, natural locomotion in immersive virtual environments remains a significant challenge. Many of the VR applications available to date require seated use or limit body movement to a small area, instead relying on a gamepad or mouse and keyboard for movement within the virtual world. Lacking support for natural walking, these virtual reality experiences do not fully replicate the physical and perceptual cues of the real world, and often fall short in maintaining the illusion that the user has been transported to another place.
We present a virtual reality demonstration that supports infinite walking within a confined physical space. This is achieved using redirected walking, a class of techniques that introduce subtle discrepancies between physical and virtual motions [Razzaque et al. 2001]. When employed properly, redirected walking can be stunningly effective. Previous research has made users believe that they walked in a straight line when they actually traveled in a wide circle, or that they walked between waypoints in a long virtual hallway when in fact they went back and forth between the same two points in the real world. While perceptually compelling, redirected walking is challenging to employ effectively in an unconstrained scenario because users' movements are often unpredictable. Therefore, our recent research has focused on dynamic planning and optimization of redirected walking techniques, enabling the system to intelligently apply redirection as users explore virtual environments of arbitrary size and shape [Azmandian et al. 2014a; Azmandian et al. 2014b].
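To make the underlying idea concrete, the sketch below shows how per-frame rotation and curvature gains can remap tracked motion into the virtual world. It is a minimal illustration of the general technique, not the implementation used in this work; the gain values, threshold choices, and function names are assumptions for the sake of the example.

import math

# Illustrative values only; perceptual thresholds reported in the literature
# vary and are not taken from this paper.
ROTATION_GAIN = 1.2        # assumed: amplify physical head turns by ~20%
CURVATURE_RADIUS = 7.5     # assumed: bend straight virtual walks onto a ~7.5 m physical circle

def redirect(virtual_yaw, delta_yaw, delta_position):
    """Map one frame of tracked motion (delta_yaw in radians, delta_position
    as (dx, dz) in metres of tracking space) to an updated virtual heading."""
    # Rotation gain: scale the user's own turning so they rotate slightly
    # more in the virtual world than in the physical one.
    virtual_yaw += delta_yaw * ROTATION_GAIN

    # Curvature gain: while the user walks, slowly rotate the virtual world
    # so that a straight virtual path corresponds to a physical arc.
    distance_walked = math.hypot(*delta_position)
    virtual_yaw += distance_walked / CURVATURE_RADIUS

    return virtual_yaw

Applied every frame, these small discrepancies stay below what the user notices while steadily bending their physical trajectory.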
In this Emerging Technologies exhibit, attendees will explore a large-scale, outdoor immersive virtual environment in a head-mounted display (see Figure 1). The demonstration will support natural walking within a physical area of at least 6 × 6 m, using a wide-area motion tracking system provided by PhaseSpace Inc. The virtual reality scenario will instruct users to scout the environment while stopping to take panoramic photos at various locations in the virtual world. As users explore the environment, our automated planning algorithm will dynamically apply redirection to optimally steer them away from the physical boundaries of the exhibit, thus enabling the experience of limitless walking in a potentially infinite virtual world (see Figure 2).
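As a rough illustration of boundary-aware steering (not the exhibit's planner, which performs dynamic planning and optimization over the full environment), the heuristic below chooses the direction in which injected curvature should bend the user's physical path so that they arc back toward the center of the tracked space. The function name and the fixed room center are assumptions made for the example.

import math

def curvature_sign(user_pos, user_yaw, center=(0.0, 0.0)):
    """Return +1 or -1: the side toward which injected curvature should bend
    the user's physical path so they curve back toward the room center."""
    to_center = (center[0] - user_pos[0], center[1] - user_pos[1])
    heading = (math.cos(user_yaw), math.sin(user_yaw))
    # The sign of the 2D cross product indicates whether the center lies to
    # the user's left (+1) or right (-1) of their current heading.
    cross = heading[0] * to_center[1] - heading[1] * to_center[0]
    return 1.0 if cross >= 0.0 else -1.0

In such a steer-to-center scheme, the returned sign would simply multiply the curvature term injected each frame, as in the earlier sketch.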