HMM-based Detection of Head Nods to Evaluate Conversational Engagement from Head Motion Data
PubDate: December 2021
Team: University of California
Authors: Saygin Artiran; Leanne Chukoskie; Ara Jung; Ian Miller; Pamela Cosman
Abstract
Head gestures such as nodding and shaking play a prominent role in conversation, signaling active listening and interest. We aim to create a tool to assess these physical conversational engagement cues in the context of a mock job interview. We propose a hidden Markov model (HMM)-based architecture that locates and classifies head nods and shakes from head motion data in an online fashion. Based on the number, velocity, and duration of the detected head gestures, we estimate the conversational engagement level with a linear regression model. For the interview segments, the model's engagement scores showed high agreement with those of human raters. We see this system as a path toward augmented reality and virtual reality-based training that can broaden participation in careers with competitive hiring scenarios.
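To make the pipeline concrete, here is a minimal sketch of the kind of HMM scoring and regression step the abstract describes. The paper does not publish its model parameters or feature discretization, so everything below is an assumption: per-frame head motion is discretized into symbols (still / pitch-dominant / yaw-dominant), a small discrete-emission HMM per gesture is scored with the forward algorithm, and the engagement score uses placeholder regression weights, not the coefficients fitted in the paper.

```python
import numpy as np

# Hypothetical observation alphabet after discretizing per-frame head motion:
# 0 = still, 1 = pitch-dominant (up/down) motion, 2 = yaw-dominant (left/right) motion.

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for start probabilities pi,
    transition matrix A, and discrete emission matrix B (states x symbols)."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    alpha /= c
    log_p = np.log(c)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        alpha /= c
        log_p += np.log(c)
    return log_p

# Two toy 2-state models (state 0 = "moving", state 1 = "pause").
# All numbers are illustrative, not the paper's trained parameters.
PI = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.3, 0.7]])
B_NOD = np.array([[0.1, 0.8, 0.1],    # moving state mostly emits pitch motion
                  [0.8, 0.1, 0.1]])   # pause state mostly emits stillness
B_SHAKE = np.array([[0.1, 0.1, 0.8],  # moving state mostly emits yaw motion
                    [0.8, 0.1, 0.1]])

def classify_segment(obs):
    """Label a motion segment as the gesture whose HMM scores it highest."""
    scores = {
        "nod": forward_log_likelihood(obs, PI, A, B_NOD),
        "shake": forward_log_likelihood(obs, PI, A, B_SHAKE),
    }
    return max(scores, key=scores.get)

def engagement_score(num_gestures, mean_velocity, mean_duration):
    """Hypothetical linear-regression engagement score; the weights are
    placeholders standing in for the paper's fitted coefficients."""
    w = np.array([0.5, 0.3, 0.2])
    return float(w @ np.array([num_gestures, mean_velocity, mean_duration]))

print(classify_segment([1, 1, 0, 1, 1, 1]))  # pitch-heavy segment
print(classify_segment([2, 2, 0, 2, 2, 2]))  # yaw-heavy segment
```

In this sketch, competing per-gesture HMMs are compared by log-likelihood, which supports the online operation the abstract mentions: the forward recursion updates one frame at a time, so a segment can be scored as data arrives.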