Multisensory cues facilitate coordination of stepping movements with a virtual reality avatar

Note: We do not have the ability to review this paper

PubDate: Jun 2019

Teams: University of Warwick

Writers: Omar Khan, Imran Ahmed, Joshua Cottingham, Musa Rahhal, Theodoros N Arvanitis, Mark Elliott

PDF: Multisensory cues facilitate coordination of stepping movements with a virtual reality avatar

Abstract

The effectiveness of simple sensory cues for retraining gait has been demonstrated, yet the feasibility of humanoid avatars for entrainment has yet to be investigated. Here, we describe the development of a novel method of visually cued training, in the form of a virtual partner, and investigate its ability to provide movement guidance in the form of stepping. Real stepping movements were mapped onto an avatar using motion capture data. The trajectory of one of the avatar's step cycles was then accelerated or decelerated by 15% to create a perturbation. Healthy participants were motion captured while instructed to step in time to the avatar's movements, as viewed through a virtual reality headset. Step onset times were used to measure the timing errors (asynchronies) between participant and avatar. Participants completed either a visual-only condition or an auditory-visual condition with footstep sounds included. Participants' asynchronies exhibited slow drift in the Visual-Only condition, but became stable in the Auditory-Visual condition. Moreover, we observed a clear corrective response to the phase perturbation in both auditory-visual conditions. We conclude that an avatar's movements can be used to influence a person's own gait, but should include relevant auditory cues congruent with the movement to ensure a suitable accuracy is achieved.
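The perturbation and asynchrony measures described above can be sketched in code. This is a minimal illustration, not the authors' implementation: the function names, the regular 0.6 s step interval, and the onset representation are all assumptions; only the 15% single-cycle time scaling and the onset-difference definition of asynchrony come from the abstract.

```python
import numpy as np

def perturb_cycle(onsets, cycle_idx, factor):
    """Time-scale one step cycle by `factor` (e.g. 1.15 decelerates
    that cycle by 15%), shifting all subsequent onsets by the change
    in that cycle's duration. `onsets` are step-onset times in seconds."""
    onsets = np.asarray(onsets, dtype=float)
    duration = onsets[cycle_idx + 1] - onsets[cycle_idx]
    shift = duration * (factor - 1.0)
    out = onsets.copy()
    out[cycle_idx + 1:] += shift
    return out

def asynchronies(participant_onsets, avatar_onsets):
    """Signed timing error of each participant step relative to the
    corresponding avatar step (positive = participant lags)."""
    return np.asarray(participant_onsets) - np.asarray(avatar_onsets)

# Hypothetical example: avatar steps every 0.6 s; the fifth cycle
# is decelerated by 15%, delaying every later onset by 0.09 s.
avatar = np.arange(10) * 0.6
perturbed = perturb_cycle(avatar, cycle_idx=4, factor=1.15)
```

A corrective response to the perturbation would then appear as the asynchrony series returning toward its pre-perturbation baseline over the steps following `cycle_idx`.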
