
EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras

Note: We do not have the ability to review papers.

PubDate: Oct 2024

Teams: Carnegie Mellon University

Authors: Vimal Mollyn, Chris Harrison

PDF: EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras

Abstract

In augmented and virtual reality (AR/VR) experiences, a user’s arms and hands can provide a convenient and tactile surface for touch input. Prior work has shown on-body input to have significant speed, accuracy, and ergonomic benefits over in-air interfaces, which are common today. In this work, we demonstrate high-accuracy, bare-hands skin input (i.e., no special instrumentation of the user) using just an RGB camera, like those already integrated into all modern XR headsets. Our results show this approach can be accurate and robust across diverse lighting conditions, skin tones, and body motion (e.g., input while walking). Finally, our pipeline also provides rich input metadata, including touch force, finger identification, angle of attack, and rotation. We believe these are the requisite technical ingredients to more fully unlock on-skin interfaces that have been well motivated in the HCI literature but have lacked robust and practical methods.
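To make the metadata listed above concrete, here is a minimal sketch of how a per-touch event from such a pipeline might be represented. All names (`TouchEvent`, `classify_force`, the field names, and the 0.5 threshold) are hypothetical illustrations, not the paper's actual API; the paper reports these signal types but does not prescribe a data format.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    # Hypothetical container for the metadata the abstract describes.
    x: float                # touch position on the skin surface (normalized)
    y: float
    force: str              # touch-force class, e.g. "light" or "hard"
    finger: str             # which finger made contact, e.g. "index"
    angle_of_attack: float  # finger pitch relative to the skin, in degrees
    rotation: float         # finger rotation (yaw), in degrees

def classify_force(pressure_score: float, threshold: float = 0.5) -> str:
    """Toy binarization of a model's continuous pressure score into a class."""
    return "hard" if pressure_score >= threshold else "light"

# Example: one touch event assembled from (made-up) model outputs.
event = TouchEvent(x=0.42, y=0.77,
                   force=classify_force(0.8),
                   finger="index",
                   angle_of_attack=35.0,
                   rotation=-10.0)
print(event.force)  # hard
```

Downstream UI code could then route events by finger or force class, e.g. treating a "hard" index-finger touch as a click and a "light" touch as a hover.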
