
Realtime Hand-Object Interaction Using Learned Grasp Space for Virtual Environments

Note: We do not have the ability to review papers.

PubDate: June 2018

Teams: East China Normal University; University of Maryland

Writers: Hao Tian; Changbo Wang; Dinesh Manocha; Xinyu Zhang

PDF: Realtime Hand-Object Interaction Using Learned Grasp Space for Virtual Environments

Abstract

We present a realtime virtual grasping algorithm to model interactions with virtual objects. Our approach is designed for multi-fingered hands and makes no assumptions about the motion of the user’s hand or the virtual objects. Given a model of the virtual hand, we use machine learning and particle swarm optimization to automatically pre-compute stable grasp configurations for that object. The learning pre-computation step is accelerated using GPU parallelization. At runtime, we rely on the pre-computed stable grasp configurations, dynamics/non-penetration constraints, and motion planning techniques to compute plausible-looking grasps. In practice, our realtime algorithm can perform virtual grasping operations in less than 20 ms for complex virtual objects, including high-genus objects with holes. We have integrated our grasping algorithm with an Oculus Rift HMD and a Leap Motion controller and evaluated its performance for different tasks corresponding to grabbing virtual objects and placing them at arbitrary locations. Our user evaluation suggests that our virtual grasping algorithm can increase the user’s sense of realism and participation in these tasks and offers considerable benefits over prior interaction algorithms, such as pinch grasping and raycast picking.
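To illustrate the kind of offline search the abstract describes, here is a minimal, hypothetical sketch of using particle swarm optimization (PSO) to look for low-cost (stable) hand poses in a learned grasp space. The cost function, pose dimensionality, and all parameters below are illustrative placeholders and not the authors' implementation; a real system would score contacts, force closure, and penetration against the object's geometry and would run this step on the GPU.

```python
# Hypothetical sketch: PSO over a hand-pose vector to find a low-cost grasp.
# Not the paper's implementation; all names and parameters are assumptions.
import numpy as np

N_JOINTS = 20        # assumed dimensionality of the hand pose vector
N_PARTICLES = 64
N_ITERS = 200

def grasp_cost(pose: np.ndarray) -> float:
    """Placeholder grasp-quality cost: lower means 'more stable'.
    A real cost would evaluate contact points and non-penetration
    against the target object's mesh."""
    return float(np.sum((pose - 0.5) ** 2))  # toy quadratic stand-in

def pso_grasp_search(seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    # Particle positions (candidate hand poses) and velocities.
    x = rng.uniform(0.0, 1.0, size=(N_PARTICLES, N_JOINTS))
    v = np.zeros_like(x)
    p_best = x.copy()                              # per-particle best poses
    p_cost = np.array([grasp_cost(p) for p in x])  # their costs
    g_best = p_best[np.argmin(p_cost)].copy()      # global best pose

    w, c1, c2 = 0.7, 1.5, 1.5                      # inertia / attraction weights
    for _ in range(N_ITERS):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
        x = np.clip(x + v, 0.0, 1.0)               # keep joints in a valid range
        costs = np.array([grasp_cost(p) for p in x])
        improved = costs < p_cost
        p_best[improved], p_cost[improved] = x[improved], costs[improved]
        g_best = p_best[np.argmin(p_cost)].copy()
    return g_best

if __name__ == "__main__":
    best_pose = pso_grasp_search()
    print("best cost:", grasp_cost(best_pose))
```

At runtime the paper does not re-run this search; it retrieves the pre-computed grasp configurations and enforces dynamics and non-penetration constraints while planning the hand motion, which is what keeps the per-grasp cost under 20 ms.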
