A User Perception–Based Approach to Create Smiling Embodied Conversational Agents

Note: We don't have the ability to review the paper

PubDate: January 2017

Teams: Aix Marseille Université; CNRS-LTCI Télécom ParisTech; Queen's University of Belfast

Writers: Magalie Ochs; Catherine Pelachaud; Gary McKeown

PDF: A User Perception–Based Approach to Create Smiling Embodied Conversational Agents

Abstract

In order to improve the social capabilities of embodied conversational agents, we propose a computational model to enable agents to automatically select and display appropriate smiling behavior during human–machine interaction. A smile may convey different communicative intentions depending on subtle characteristics of the facial expression and contextual cues. To construct such a model, as a first step, we explore the morphological and dynamic characteristics of different types of smiles (polite, amused, and embarrassed smiles) that an embodied conversational agent may display. The resulting lexicon of smiles is based on a corpus of virtual agents' smiles directly created by users and analyzed through a machine-learning technique. Moreover, during an interaction, a smiling expression influences the observer's perception of the interpersonal stance of the speaker. As a second step, we propose a probabilistic model to automatically compute the user's potential perception of the embodied conversational agent's social stance depending on its smiling behavior and on its physical appearance. This model, based on a corpus of users' perceptions of smiling and nonsmiling virtual agents, enables a virtual agent to determine the appropriate smiling behavior to adopt given the interpersonal stance it wants to express. An experiment using real human–virtual agent interaction provided some validation of the proposed model.
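To make the selection step described in the abstract concrete, here is a minimal Python sketch of how an agent might pick a smile from a lexicon so as to maximize the probability that the user perceives a target interpersonal stance. This is not the authors' implementation: the lexicon entries, parameter names, stance labels, appearance categories, and probability values are illustrative placeholders standing in for the paper's user-derived smile lexicon and learned perception model.

```python
# Hedged sketch of stance-driven smile selection (assumed structures, not the paper's code).
from dataclasses import dataclass


@dataclass
class Smile:
    # Morphological/dynamic parameters a lexicon entry might carry
    # (field names are assumptions, loosely inspired by FACS-style descriptions).
    kind: str             # "polite", "amused", or "embarrassed"
    open_mouth: bool
    cheek_raising: bool   # e.g. AU6 activation
    duration_s: float


# Hypothetical smile lexicon, standing in for the one learned from user-created smiles.
LEXICON = [
    Smile("polite",      open_mouth=False, cheek_raising=False, duration_s=1.0),
    Smile("amused",      open_mouth=True,  cheek_raising=True,  duration_s=2.0),
    Smile("embarrassed", open_mouth=False, cheek_raising=False, duration_s=1.5),
]

# Hypothetical conditional table P(perceived stance | smile kind, agent appearance),
# standing in for the probabilistic perception model learned from user ratings.
P_STANCE = {
    ("polite",      "cartoonish"): {"warm": 0.4, "competent": 0.4, "neutral": 0.2},
    ("polite",      "realistic"):  {"warm": 0.3, "competent": 0.5, "neutral": 0.2},
    ("amused",      "cartoonish"): {"warm": 0.7, "competent": 0.2, "neutral": 0.1},
    ("amused",      "realistic"):  {"warm": 0.6, "competent": 0.3, "neutral": 0.1},
    ("embarrassed", "cartoonish"): {"warm": 0.3, "competent": 0.1, "neutral": 0.6},
    ("embarrassed", "realistic"):  {"warm": 0.2, "competent": 0.1, "neutral": 0.7},
}


def select_smile(target_stance: str, appearance: str) -> Smile:
    """Return the lexicon smile most likely to be perceived as the target stance."""
    return max(
        LEXICON,
        key=lambda s: P_STANCE[(s.kind, appearance)].get(target_stance, 0.0),
    )


if __name__ == "__main__":
    chosen = select_smile(target_stance="warm", appearance="realistic")
    print(f"Display smile type '{chosen.kind}': {chosen}")
```

In this sketch the perception model is a simple lookup table; the key design point it mirrors is that the choice of smile is driven by the stance the agent wants the user to perceive, conditioned on both the smile's characteristics and the agent's appearance.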
