
MediaPipe Hands: On-device Real-time Hand Tracking

Note: We do not have the ability to review papers.

PubDate: June 2020

Teams: Google Research

Writers: Fan Zhang, Valentin Bazarevsky, Andrey Vakunov, George Sung, Chuo-Ling Chang, Matthias Grundmann, Andrei Tkachenka

PDF: MediaPipe Hands: On-device Real-time Hand Tracking

Project: MediaPipe Hands: On-device Real-time Hand Tracking

Abstract

We present a real-time on-device hand tracking pipeline that predicts a hand skeleton from a single RGB camera for AR/VR applications. The pipeline consists of two models: 1) a palm detector, 2) a hand landmark model. It is implemented via MediaPipe, a framework for building cross-platform ML solutions. The proposed model and pipeline architecture demonstrate real-time inference speed on mobile GPUs and high prediction quality. MediaPipe Hands is open source at https://mediapipe.dev.
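The two-stage design in the abstract can be sketched as follows. This is a minimal illustration, not the actual MediaPipe implementation: `detect_palm`, `predict_landmarks`, and `HandTracker` are hypothetical stubs standing in for the palm detector and hand landmark model, and the frames are plain dictionaries. The sketch assumes the tracking behavior described in the paper: the detector supplies a crop of the hand, the landmark model regresses keypoints inside that crop, and while the model remains confident a hand is present, the crop is carried over to the next frame so the detector can be skipped.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Landmarks:
    points: List[Tuple[float, float, float]]  # 21 (x, y, z) hand keypoints
    presence: float                           # confidence a hand is in the crop

def detect_palm(frame: dict) -> Optional[Tuple[int, int, int, int]]:
    """Hypothetical palm detector stub: returns a crop region or None."""
    return (0, 0, 128, 128) if frame.get("has_hand") else None

def predict_landmarks(frame: dict, roi: tuple) -> Landmarks:
    """Hypothetical landmark model stub: regresses keypoints in the crop."""
    return Landmarks(points=[(0.5, 0.5, 0.0)] * 21,
                     presence=0.9 if frame.get("has_hand") else 0.1)

class HandTracker:
    """Runs the palm detector only when tracking is lost, then follows
    the hand frame to frame with the landmark model alone."""

    def __init__(self, presence_threshold: float = 0.5):
        self.threshold = presence_threshold
        self.roi = None          # crop carried over from the previous frame
        self.detector_runs = 0   # bookkeeping for this sketch only

    def process(self, frame: dict) -> Optional[Landmarks]:
        if self.roi is None:
            # Tracking lost (or first frame): run the detector.
            self.detector_runs += 1
            self.roi = detect_palm(frame)
            if self.roi is None:
                return None
        lm = predict_landmarks(frame, self.roi)
        if lm.presence < self.threshold:
            # Hand left the crop: force a fresh detection next frame.
            self.roi = None
            return None
        return lm
```

In steady state the cheap landmark model does all the work per frame; the heavier palm detector only fires when the hand presence score drops below the threshold, which is what makes the pipeline fast enough for real-time mobile inference.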
