Real-time Human Motion Forecasting using a RGB Camera
PubDate: August 2019
Teams: Tokyo Institute of Technology
Writers: Erwin Wu; Hideki Koike
PDF: Real-time Human Motion Forecasting using a RGB Camera
Abstract
This paper proposes a real-time human motion forecasting system that visualizes future poses in virtual reality using a single RGB camera. The system consists of three parts: 2D pose estimation from RGB frames using a residual neural network, 2D pose forecasting using a recurrent neural network, and 3D recovery from the predicted 2D pose using a residual linear network. To improve the learning of temporal features for prediction, we propose a lattice optical flow method for estimating joint movement. After fitting the skeleton, a predicted 3D model of the target person is built 0.5 s in advance in a 30-fps video.
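A minimal sketch of how the forecasting and 3D-recovery stages described above could be wired together is shown below. The module names (`PoseForecaster`, `Lift2Dto3D`), layer sizes, joint count, and the 15-frame horizon (0.5 s at 30 fps) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative PyTorch sketch of the pipeline described in the abstract.
# Module names, layer sizes, and the 15-frame horizon (0.5 s at 30 fps)
# are assumptions for exposition, not the authors' implementation.
import torch
import torch.nn as nn

NUM_JOINTS = 14          # assumed 2D skeleton size
HORIZON = 15             # 0.5 s look-ahead at 30 fps

class PoseForecaster(nn.Module):
    """RNN that predicts a future 2D pose from a short pose history."""
    def __init__(self, num_joints=NUM_JOINTS, hidden=256):
        super().__init__()
        self.rnn = nn.LSTM(input_size=num_joints * 2, hidden_size=hidden,
                           num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, num_joints * 2)

    def forward(self, pose_seq):
        # pose_seq: (batch, time, num_joints * 2) normalized 2D coordinates
        feats, _ = self.rnn(pose_seq)
        return self.head(feats[:, -1])   # 2D pose HORIZON frames ahead

class Lift2Dto3D(nn.Module):
    """Residual linear network lifting a predicted 2D pose to 3D."""
    def __init__(self, num_joints=NUM_JOINTS, width=1024):
        super().__init__()
        self.inp = nn.Linear(num_joints * 2, width)
        self.block = nn.Sequential(
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
        )
        self.out = nn.Linear(width, num_joints * 3)

    def forward(self, pose2d):
        h = torch.relu(self.inp(pose2d))
        h = h + self.block(h)            # residual connection
        return self.out(h)

# Usage: a 2D pose estimator (e.g. a residual CNN) would produce pose_seq
# from recent RGB frames; the forecaster then predicts the pose 15 frames
# ahead, and the lifting network recovers its 3D joint positions.
pose_seq = torch.randn(1, 10, NUM_JOINTS * 2)    # dummy 10-frame 2D pose history
future_2d = PoseForecaster()(pose_seq)
future_3d = Lift2Dto3D()(future_2d).view(1, NUM_JOINTS, 3)
print(future_3d.shape)                           # torch.Size([1, 14, 3])
```

In this sketch the joint-movement cues from the lattice optical flow step would enter as additional input features to the forecaster; how they are encoded is not specified in the abstract.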