
Walk a Robot Dog in VR!

Note: We do not have the ability to review this paper.

PubDate: August 2020

Teams: UNC Chapel Hill

Writers: Nicholas Rewkowski; Ming Lin

PDF: Walk a Robot Dog in VR!

Abstract

Realistic locomotion in a virtual environment (VE) can help maximize immersion and decrease simulator sickness. Redirected walking (RDW) allows a user to physically walk in VR by rotating the VE as a function of head rotation so that the user walks in an arc that fits within the tracking area. However, this requires significant user rotation, and in commercial tracking spaces a “distractor” is often needed to induce it. Previous implementations suddenly spawned a distractor (e.g., a butterfly) when the user walked near the safe boundary, with limitations such as the user triggering the distraction accidentally by looking around, the distractor going unacknowledged, or the user getting “stuck” in a corner. We explore a persistent robot distractor tethered to the user that provides two-way haptic feedback and natural motion constraints. We design a dynamic robot AI that adapts to randomness in the user’s behavior, as well as to trajectory changes caused by tugging on its leash. The robot tries to imperceptibly keep the user safe by replicating a real dog’s behaviors, such as barking or sniffing something. We hypothesize that the naturalness of the dog behavior, its responses to the user, and the haptic tethering will work together to allow the user to explore the entire city, ideally without noticing that the dog is a robot.
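As a rough illustration of the redirection idea the abstract describes (rotating the VE as a function of head rotation so the physical path bends into an arc), here is a minimal sketch of per-frame rotation and curvature gains. It is not code from the paper; the gain values, function name, and constants are hypothetical placeholders chosen for readability.

```python
import math

# Minimal sketch (assumed, not from the paper) of redirected walking gains:
# the virtual scene is rotated slightly as the user turns and walks, so the
# physical trajectory curves into an arc that stays inside the tracking area.

ROTATION_GAIN = 1.2      # hypothetical: amplify physical head rotation by 20%
CURVATURE_RADIUS = 7.5   # hypothetical: radius (m) of the arc injected while walking


def redirect(yaw_delta_rad: float, step_length_m: float) -> float:
    """Return the extra virtual-world yaw (radians) to apply this frame.

    yaw_delta_rad -- change in physical head yaw since the last frame
    step_length_m -- physical distance walked since the last frame
    """
    # Rotation gain: scale the user's own head rotation.
    rotation_term = yaw_delta_rad * (ROTATION_GAIN - 1.0)
    # Curvature gain: inject rotation proportional to distance walked,
    # steering the user along an arc of radius CURVATURE_RADIUS.
    curvature_term = step_length_m / CURVATURE_RADIUS
    return rotation_term + curvature_term


# Example: a 5-degree head turn while taking a 0.7 m step.
extra_yaw = redirect(math.radians(5.0), 0.7)
print(f"virtual scene rotated by an extra {math.degrees(extra_yaw):.2f} degrees")
```

In such a scheme the gains must stay below perceptual detection thresholds, which is why the paper relies on a distractor (here, the leashed robot dog) to provoke the additional head rotation that makes redirection effective in small commercial tracking spaces.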
