Emotion Recognition Using a Glasses-Type Wearable Device via Multi-Channel Facial Responses
PubDate: October 2021
Teams: Korea Institute of Science and Technology; Hanyang University
Writers: Jangho Kwon; Jihyeon Ha; Da-Hye Kim; Jun Won Choi; Laehyun Kim
We present a glasses-type wearable device that detects emotions from the human face in an unobtrusive manner. The device is designed to gather multi-channel responses from the user’s face naturally and continuously while the glasses are worn. The multi-channel facial responses consist of local facial images and biosignals, namely electrodermal activity (EDA) and photoplethysmogram (PPG). Because EDA signal quality is highly sensitive to the sensing position, we conducted experiments to determine the optimal placement of the EDA sensors on the wearable device. In addition to the physiological data, a built-in camera captures the image region representing local facial expressions around the left eye. In this study, we developed and validated an algorithm that recognizes emotions from the multi-channel responses obtained by the device. The results show that an emotion recognition algorithm using only the local facial images classifies emotions with an accuracy of 76.09%. Using the multi-channel data including EDA and PPG, this accuracy increased by 8.46% compared to using the local facial expressions alone. By measuring multi-channel facial responses in a natural manner, this glasses-type wearable system is well suited to monitoring a user’s emotions in daily life and has strong potential for use in the healthcare industry.
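The abstract does not specify the fusion method or classifier, so as a minimal illustrative sketch, feature-level fusion of the three channels could look like the following: per-channel feature vectors (here only toy pooled statistics, standing in for learned features) are concatenated and fed to a simple classifier (a nearest-centroid model, chosen only for brevity). All function names, feature choices, and the synthetic data are assumptions, not the paper's actual pipeline.

```python
import numpy as np

def image_features(img):
    # Stand-in for a learned extractor over the local facial image
    # (the paper crops around the left eye); here we just pool statistics.
    return np.array([img.mean(), img.std()])

def biosignal_features(sig):
    # Simple statistical features from a 1-D EDA or PPG window.
    return np.array([sig.mean(), sig.std(), sig.max() - sig.min()])

def fuse(img, eda, ppg):
    # Feature-level fusion: concatenate per-channel feature vectors.
    return np.concatenate([image_features(img),
                           biosignal_features(eda),
                           biosignal_features(ppg)])

class NearestCentroid:
    # Minimal classifier standing in for the paper's (unspecified) model.
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0)
                                    for c in self.classes_])
        return self

    def predict(self, X):
        # Assign each sample to the class with the closest centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

# Synthetic two-emotion demo: class 1 has higher image intensity and
# stronger EDA/PPG responses than class 0 (fabricated toy data).
def sample(level):
    img = np.full((8, 8), float(level))
    eda = np.full(32, float(level) * 0.5)
    ppg = np.full(64, float(level) * 0.3)
    return fuse(img, eda, ppg)

X = np.stack([sample(0.0), sample(0.2), sample(5.0), sample(5.2)])
y = np.array([0, 0, 1, 1])

model = NearestCentroid().fit(X, y)
pred = model.predict(X)
```

In practice the image branch would be a trained network and the fused vector would feed a stronger classifier; the sketch only shows why concatenating EDA and PPG features with image features can add discriminative information beyond the facial expression channel alone.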