Tactile glove-decode and classify the human grasping process

Note: We are not able to review this paper.

PubDate: October 2021

Teams: Tianjin University

Authors: Zijian Cui; Tianshi Gao; Jiang Wang; Bin Deng

PDF: Tactile glove-decode and classify the human grasping process

Abstract

Tactile information helps a robot complete tasks such as checking and controlling the placement of objects and identifying and grasping them, which is essential for exploring the external environment and manipulating objects. Robot grasping strategies based on computer vision have made considerable progress with the emergence of new machine learning tools and the growth of visual datasets. By comparison, research on tactile-based robot grasping remains limited, because there are few comparable tactile sensor platforms and large-scale tactile datasets, which restricts our understanding of the role of tactile information in human grasping. To detect the pressure changes of the human hand when grasping objects, we developed a high-density flexible tactile glove whose sensor array consists of 807 independent piezoresistive sensing units. Through grasping experiments, a large-scale tactile dataset was collected, and the shape and posture trajectories during grasping were decoded with the PCA algorithm. Based on a convolutional neural network, an object recognition algorithm driven by pressure distribution was designed; detection tests on ordinary objects show a recognition accuracy above 95%. The experiments verify that the high-density flexible tactile glove and the algorithms designed in this paper help reveal the pressure change patterns of human grasping and will aid the design of future tactile-feedback robots and smart prostheses.
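As a rough illustration of the PCA decoding step described in the abstract, the sketch below projects a sequence of pressure frames from a 807-unit sensor array onto its top principal components. This is not the authors' code: the frame count, random data, and function names are assumptions for illustration; the paper's real input would be recorded glove readings.

```python
import numpy as np

N_SENSORS = 807   # number of piezoresistive sensing units reported in the abstract
N_FRAMES = 200    # assumed length of one grasp sequence (illustrative)

def pca_decode(frames: np.ndarray, n_components: int = 3) -> np.ndarray:
    """Project a (frames x sensors) pressure sequence onto its top
    principal components, yielding a low-dimensional grasp trajectory."""
    centered = frames - frames.mean(axis=0)
    # SVD of the centered data: rows of vt are the principal directions
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

# Simulated stand-in for recorded glove data
rng = np.random.default_rng(0)
frames = rng.random((N_FRAMES, N_SENSORS))
trajectory = pca_decode(frames)
print(trajectory.shape)  # (200, 3)
```

In the paper's setting, each row of the projected trajectory would summarize the hand's pressure distribution at one instant, so the curve in PC space traces the shape and posture evolution of the grasp.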
