Inferring Private Personal Attributes of Virtual Reality Users from Head and Hand Motion Data

Note: We do not have the ability to review this paper.

PubDate: May 2023

Teams: UC Berkeley; University of Würzburg

Writers: Vivek Nair, Christian Rack, Wenbo Guo, Rui Wang, Shuixian Li, Brandon Huang, Atticus Cull, James F. O’Brien, Louis Rosenberg, Dawn Song

PDF: Inferring Private Personal Attributes of Virtual Reality Users from Head and Hand Motion Data

Abstract

Motion tracking “telemetry” data lies at the core of nearly all modern virtual reality (VR) and metaverse experiences. While generally presumed innocuous, recent studies have demonstrated that motion data actually has the potential to uniquely identify VR users. In this study, we go a step further, showing that a variety of private user information can be inferred just by analyzing motion data recorded by VR devices. We conducted a large-scale survey of VR users (N=1,006) with dozens of questions ranging from background and demographics to behavioral patterns and health information. We then collected VR motion samples of each user playing the game “Beat Saber,” and attempted to infer their survey responses using just their head and hand motion patterns. Using simple machine learning models, many of these attributes could accurately and consistently be inferred from VR motion data alone, highlighting the pressing need for privacy-preserving mechanisms in multi-user VR applications.
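To make the attack surface concrete, here is a minimal, hypothetical sketch of the kind of "simple machine learning model" the abstract alludes to: a nearest-centroid classifier that predicts a binary user attribute from summary statistics of head and hand motion. The feature choices (mean head height, mean hand speed), the synthetic data, and all parameter values are illustrative assumptions, not the authors' actual pipeline or dataset.

```python
# Hypothetical sketch (NOT the paper's actual method or data): inferring a
# binary user attribute from summary statistics of VR head/hand motion,
# using a nearest-centroid classifier on synthetic per-user features.
import random
import statistics

random.seed(0)

def synthetic_user(attribute):
    # Assumed features: mean head height (m) and mean hand speed (m/s).
    # The two classes differ slightly in both, with Gaussian noise added.
    base_height = 1.60 if attribute == 0 else 1.75
    base_speed = 1.0 if attribute == 0 else 1.3
    return [base_height + random.gauss(0, 0.05),
            base_speed + random.gauss(0, 0.1)]

# Labeled datasets of per-user feature vectors.
train = [(synthetic_user(a), a) for a in [0, 1] * 50]
test = [(synthetic_user(a), a) for a in [0, 1] * 25]

def centroid(samples):
    # Per-dimension mean of a list of feature vectors.
    return [statistics.mean(col) for col in zip(*samples)]

centroids = {
    label: centroid([x for x, a in train if a == label])
    for label in (0, 1)
}

def predict(x):
    # Assign to the class whose centroid is nearest (squared Euclidean).
    return min(centroids,
               key=lambda c: sum((xi - ci) ** 2
                                 for xi, ci in zip(x, centroids[c])))

accuracy = sum(predict(x) == a for x, a in test) / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

The point of the sketch is that even crude aggregate features of motion telemetry can separate user groups; the paper's finding is that real head and hand motion supports inferences about many such attributes at once.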
