Real-time Pupil Tracking from Monocular Video for Digital Puppetry
PubDate: June 2020
Teams: Google Research
Writers: Artsiom Ablavatski, Andrey Vakunov, Ivan Grischenko, Karthik Raveendran, Matsvei Zhdanovich, Matthias Grundmann
PDF: Real-time Pupil Tracking from Monocular Video for Digital Puppetry
Abstract
We present a simple, real-time approach for pupil tracking from live video on mobile devices. Our method extends a state-of-the-art face mesh detector with two new components: a tiny neural network that predicts positions of the pupils in 2D, and a displacement-based estimation of the pupil blend shape coefficients. Our technique can be used to accurately control the pupil movements of a virtual puppet, lending it liveliness and energy. The proposed approach runs at over 50 FPS on modern phones, enabling its use in any real-time puppeteering pipeline.
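The displacement-based idea in the abstract can be sketched as follows: the pupil's offset from the eye center, normalized by the eye's extent, is mapped to look-left/right/up/down blend shape coefficients. This is a minimal illustration under assumed conventions; the `gain` factor, clamping, and landmark choice are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass


@dataclass
class Point:
    """A 2D landmark position (e.g., from a face mesh)."""
    x: float
    y: float


def pupil_blendshapes(pupil, eye_inner, eye_outer, eye_top, eye_bottom, gain=3.0):
    """Map a 2D pupil position to four gaze blend shape coefficients in [0, 1].

    The pupil's displacement from the eye center, normalized by eye width and
    height, drives look-left/right/up/down coefficients. `gain` (assumed here)
    controls how quickly a coefficient saturates.
    """
    # Eye center estimated from the corner and lid landmarks.
    cx = 0.5 * (eye_inner.x + eye_outer.x)
    cy = 0.5 * (eye_top.y + eye_bottom.y)
    eye_w = abs(eye_outer.x - eye_inner.x)
    eye_h = abs(eye_bottom.y - eye_top.y)

    # Normalized displacement of the pupil from the eye center.
    dx = (pupil.x - cx) / eye_w
    dy = (pupil.y - cy) / eye_h

    clamp = lambda v: max(0.0, min(1.0, v))
    return {
        "look_left": clamp(-dx * gain),   # pupil toward -x
        "look_right": clamp(dx * gain),   # pupil toward +x
        "look_up": clamp(-dy * gain),     # pupil toward -y (image coords)
        "look_down": clamp(dy * gain),    # pupil toward +y
    }


# Example: pupil shifted toward the outer corner of the eye.
coeffs = pupil_blendshapes(
    pupil=Point(0.6, 0.5),
    eye_inner=Point(0.3, 0.5),
    eye_outer=Point(0.7, 0.5),
    eye_top=Point(0.5, 0.45),
    eye_bottom=Point(0.5, 0.55),
)
```

Only the coefficient in the direction of displacement becomes nonzero; a virtual puppet's eye rig can consume these values directly each frame.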