QuestEnvSim: Environment-Aware Simulated Motion Tracking from Sparse Sensors

PubDate: June 2023

Teams: Seoul National University; Meta

Authors: Sunmin Lee, Sebastian Starke, Yuting Ye, Jungdam Won, Alexander Winkler

PDF: QuestEnvSim: Environment-Aware Simulated Motion Tracking from Sparse Sensors

Abstract

Replicating a user’s pose from only wearable sensors is important for many AR/VR applications. Most existing motion-tracking methods avoid environment interaction beyond foot-floor contact because of its complex dynamics and hard constraints. However, in daily life people regularly interact with their environment, e.g. by sitting on a couch or leaning on a desk. Using reinforcement learning, we show that headset and controller poses, when combined with physics simulation and environment observations, can generate realistic full-body poses even in highly constrained environments. The physics simulation automatically enforces the various constraints necessary for realistic poses, instead of requiring them to be specified manually as in many kinematic approaches. These hard constraints let us achieve high-quality interaction motions without typical artifacts such as penetration or contact sliding. We discuss three features crucial to the method's performance: the environment representation, the contact reward, and scene randomization. We demonstrate the generality of the approach through various examples, such as sitting on chairs, a couch, and boxes; stepping over boxes; rocking a chair; and turning an office chair. We believe these are some of the highest-quality results achieved for motion tracking from sparse sensors with scene interaction.
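The abstract names a contact reward as one of the three features crucial to the method, without giving its form here. As a rough illustration only, not the paper's actual formulation, the sketch below shows one common way such a term is shaped in physics-based RL: compare which body parts are in contact with the environment in the simulation versus a reference, and pass the mismatch through an exponential. All names (`sim_contacts`, `ref_contacts`, the scale `k`) are hypothetical.

```python
import math

def contact_reward(sim_contacts, ref_contacts, k=10.0):
    """Hypothetical contact-reward sketch (not the paper's formula).

    Each argument maps a body-part name to a bool: whether that part
    is in contact with the environment. The reward is highest when the
    simulated contacts match the reference contacts exactly.
    """
    if not ref_contacts:
        return 1.0
    # Fraction of reference body parts whose contact state is matched
    matches = sum(
        1 for part, in_contact in ref_contacts.items()
        if sim_contacts.get(part, False) == in_contact
    )
    frac = matches / len(ref_contacts)
    # Exponential shaping, a common choice in physics-based RL rewards
    return math.exp(-k * (1.0 - frac))

# Example: perfect match yields reward 1.0; a mismatch decays it sharply.
r_match = contact_reward({"pelvis": True}, {"pelvis": True})
r_miss = contact_reward({"pelvis": False}, {"pelvis": True})
```

In practice a reward like this would be combined with tracking terms on the headset and controller poses; the exponential keeps each term bounded in (0, 1], which simplifies weighting.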
