
3D Face Reconstruction and Gaze Tracking in the HMD for Virtual Interaction

Note: We do not have the ability to review papers.

PubDate: March 2022

Teams: Institute of Computing Technology, Chinese Academy of Sciences; Cardiff University

Writers: Shu-Yu Chen; Yu-Kun Lai; Shihong Xia; Paul Rosin; Lin Gao

PDF: 3D Face Reconstruction and Gaze Tracking in the HMD for Virtual Interaction

Abstract

With the rapid development of virtual reality (VR) technology, VR headsets, a.k.a. Head-Mounted Displays (HMDs), are widely available, allowing immersive 3D content to be viewed. A natural need for truly immersive VR is to allow bidirectional communication: the user should be able to interact with the virtual world using facial expressions and eye gaze, in addition to traditional means of interaction. Typical application scenarios include VR virtual conferencing and virtual roaming, where ideally users are able to see other users' expressions and make eye contact with them in the virtual world. In addition, eye gaze also provides a natural means of interaction with virtual objects. Despite significant achievements in recent years in the reconstruction of 3D faces from RGB or RGB-D images, it remains a challenge to reliably capture and reconstruct 3D facial expressions, including eye gaze, when the user is wearing an HMD, because the majority of the face is occluded, especially the areas around the eyes, which are essential for recognizing facial expressions and eye gaze. In this paper, we introduce a novel real-time system that is able to capture and reconstruct the 3D faces of users wearing HMDs, and to robustly recover eye gaze. We further propose a novel method to map eye gaze directions to the 3D virtual world, which provides a novel and useful interactive mode in VR. We compare our method with state-of-the-art techniques both qualitatively and quantitatively, and demonstrate the effectiveness of our system using live capture.
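The abstract does not describe how gaze directions are mapped into the virtual scene. As a rough illustration of the general idea only, and not the paper's method, the sketch below casts the recovered eye-gaze direction as a ray from the eye position (transformed by a tracked head pose) and intersects it with simple spherical proxies to pick the object being looked at. All function names, the pose convention, and the sphere-based scene are assumptions made for this example.

```python
# Illustrative sketch only: gaze-based object picking via ray casting.
# The head pose (rotation R, position p) and eye-space gaze direction are
# assumed inputs, e.g. from an HMD tracker and an eye-gaze estimator.

import numpy as np

def gaze_ray_in_world(eye_pos_local, gaze_dir_local, head_rotation, head_position):
    """Transform an eye-space gaze ray into world space using the head pose."""
    origin = head_rotation @ eye_pos_local + head_position
    direction = head_rotation @ gaze_dir_local
    return origin, direction / np.linalg.norm(direction)

def intersect_sphere(origin, direction, center, radius):
    """Return the smallest positive ray parameter t hitting the sphere, or None."""
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def pick_gazed_object(origin, direction, objects):
    """Pick the nearest object (name, center, radius) hit by the gaze ray."""
    best = None
    for name, center, radius in objects:
        t = intersect_sphere(origin, direction, np.asarray(center, float), radius)
        if t is not None and (best is None or t < best[1]):
            best = (name, t)
    return best

if __name__ == "__main__":
    # Hypothetical head pose (identity rotation at the origin), gaze along -Z.
    R, head_pos = np.eye(3), np.zeros(3)
    origin, direction = gaze_ray_in_world(np.zeros(3), np.array([0.0, 0.0, -1.0]), R, head_pos)
    scene = [("panel", (0.0, 0.0, -2.0), 0.3), ("avatar_head", (1.0, 0.0, -3.0), 0.5)]
    print(pick_gazed_object(origin, direction, scene))  # -> ('panel', 1.7)
```

In a real system the picked object (or the intersection point) would then drive interaction events such as selection or eye contact between avatars; the paper's actual mapping may differ from this ray-casting approximation.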
