MediaPipe Hands: On-device Real-time Hand Tracking
PubDate: June 2020
Teams: Google Research
Writers: Fan Zhang, Valentin Bazarevsky, Andrey Vakunov, George Sung, Chuo-Ling Chang, Matthias Grundmann, Andrei Tkachenka
PDF: MediaPipe Hands: On-device Real-time Hand Tracking
Project: MediaPipe Hands: On-device Real-time Hand Tracking
Abstract
We present a real-time on-device hand tracking pipeline that predicts a hand skeleton from a single RGB camera for AR/VR applications. The pipeline consists of two models: 1) a palm detector and 2) a hand landmark model. It is implemented via MediaPipe, a framework for building cross-platform ML solutions. The proposed model and pipeline architecture demonstrate real-time inference speed on mobile GPUs together with high prediction quality. MediaPipe Hands is open source at https://mediapipe.dev.
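As a minimal sketch of how the two-stage pipeline (palm detection followed by hand landmark prediction) can be exercised from application code, the snippet below uses the MediaPipe Python Solutions API; parameter values such as max_num_hands and min_detection_confidence are illustrative choices, not values prescribed by the paper.

# Sketch: running MediaPipe Hands on a single image via the Python Solutions API.
# The Hands solution internally chains the palm detector and the hand landmark model.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# static_image_mode=True forces palm detection on every image;
# in video mode the detector runs only as needed and landmarks are tracked.
with mp_hands.Hands(static_image_mode=True,
                    max_num_hands=2,
                    min_detection_confidence=0.5) as hands:
    image = cv2.imread("hand.jpg")                      # BGR image from OpenCV
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    if results.multi_hand_landmarks:
        for hand_landmarks in results.multi_hand_landmarks:
            # 21 landmarks per hand, each with normalized x, y and relative depth z.
            for lm in hand_landmarks.landmark:
                print(lm.x, lm.y, lm.z)

For continuous video, setting static_image_mode=False lets the pipeline skip the palm detector on most frames and derive the hand region from the previous frame's landmarks, which is what makes real-time on-device performance feasible.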