
SpecTracle: Wearable Facial Motion Tracking from Unobtrusive Peripheral Cameras

Note: We don't have the ability to review papers.

PubDate: Aug 2023

Teams: University of California

Writers: Yinan Xuan, Varun Viswanath, Sunny Chu, Owen Bartolf, Jessica Echterhoff, Edward Wang

PDF: SpecTracle: Wearable Facial Motion Tracking from Unobtrusive Peripheral Cameras

Abstract

Facial motion tracking in head-mounted displays (HMD) has the potential to enable immersive “face-to-face” interaction in a virtual environment. However, current facial tracking approaches are either not suitable for unobtrusive augmented reality (AR) glasses or cannot track arbitrary facial movements. In this work, we demonstrate a novel system called SpecTracle that tracks a user’s facial motions using two wide-angle cameras mounted right next to the visor of a HoloLens. By avoiding cameras that extend in front of the face, our system greatly improves the feasibility of integrating full-face tracking into a low-profile form factor. We also demonstrate that a neural network-based model processing the wide-angle cameras can run in real time at 24 frames per second (fps) on a mobile GPU and track independent facial movement for different parts of the face with a user-independent model. Using a short personalized calibration, the system improves its tracking performance by 42.3% compared to the user-independent model.
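To make the described pipeline concrete, below is a minimal, hypothetical sketch of a dual-camera facial-motion regressor of the kind the abstract suggests: two wide-angle peripheral camera frames are encoded separately, fused, and mapped to per-region facial-motion coefficients. The architecture, class names, input resolution, and the 52-coefficient output are assumptions for illustration, not the authors' actual SpecTracle model.

```python
# Hypothetical sketch of a dual peripheral-camera facial-motion regressor.
# Architecture details are assumptions; they are not taken from the paper.
import torch
import torch.nn as nn


class PeripheralCamEncoder(nn.Module):
    """Small CNN encoding one wide-angle peripheral camera frame into a feature vector."""

    def __init__(self, feat_dim: int = 128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(self.conv(x).flatten(1))


class FacialMotionRegressor(nn.Module):
    """Fuses left/right camera features and regresses facial-motion coefficients."""

    def __init__(self, num_coeffs: int = 52):  # coefficient count is an assumption
        super().__init__()
        self.left_enc = PeripheralCamEncoder()
        self.right_enc = PeripheralCamEncoder()
        self.head = nn.Sequential(
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, num_coeffs), nn.Sigmoid(),  # keep coefficients in [0, 1]
        )

    def forward(self, left: torch.Tensor, right: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([self.left_enc(left), self.right_enc(right)], dim=1)
        return self.head(feats)


# Example: one inference step on a pair of placeholder grayscale wide-angle frames.
model = FacialMotionRegressor().eval()
left = torch.rand(1, 1, 128, 128)   # placeholder left peripheral camera frame
right = torch.rand(1, 1, 128, 128)  # placeholder right peripheral camera frame
with torch.no_grad():
    coeffs = model(left, right)     # shape (1, 52): per-region motion coefficients
print(coeffs.shape)
```

The short personalized calibration reported in the abstract could plausibly correspond to briefly fine-tuning the output head of such a user-independent model on a small set of user-specific frames, though the paper's exact calibration procedure may differ.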
