Interpersonal stance recognition using non-verbal signals on several time windows

Abstract

We present a computational model for interpreting the nonverbal signals of a user during an interaction with a virtual character, in order to obtain a representation of the user's interpersonal stance. Our model starts from the analysis of multimodal signals on the one hand, and takes into account the temporal patterns of the interactants' behaviors on the other. That is, it analyses signals and reactions to signals in their immediate context, as well as features of signal production patterns and reaction patterns over different time windows: signal reaction, sentence reaction, conversation topic, and whole interaction. In this paper, we propose a first model parameterized using data obtained from the literature on the expression of stances through interpersonal behavior.
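To make the multi-window analysis concrete, the sketch below aggregates hypothetical nonverbal signal events over the four nested time windows named in the abstract. The event structure, the window spans, and the simple occurrence-rate feature are illustrative assumptions, not the paper's actual model or feature set.

# Minimal sketch of the multi-window idea: nonverbal signal events are
# aggregated over nested time windows of increasing span. All names
# (SignalEvent, WINDOWS, the rate feature) are hypothetical.
from dataclasses import dataclass

@dataclass
class SignalEvent:
    time: float      # seconds from the start of the interaction
    modality: str    # e.g. "smile", "gaze", "head_nod"

# Assumed window spans (in seconds), from the immediate reaction to a
# signal up to the whole interaction.
WINDOWS = {
    "signal_reaction": 2.0,
    "sentence_reaction": 10.0,
    "conversation_topic": 60.0,
    "whole_interaction": float("inf"),
}

def window_features(events: list[SignalEvent], now: float) -> dict[str, float]:
    """Count signal occurrences inside each time window ending at `now`.

    Returns one rate per window; a real model would extract richer
    features (reaction delays, mimicry, modality co-occurrence, ...).
    """
    features = {}
    for name, span in WINDOWS.items():
        start = 0.0 if span == float("inf") else max(0.0, now - span)
        in_window = [e for e in events if start <= e.time <= now]
        duration = max(now - start, 1e-6)
        features[name] = len(in_window) / duration  # signals per second
    return features

if __name__ == "__main__":
    events = [SignalEvent(1.0, "smile"), SignalEvent(8.5, "head_nod"),
              SignalEvent(58.0, "gaze")]
    print(window_features(events, now=60.0))

In this toy version the same event stream yields one feature per time scale, so a burst of signals weighs heavily in the short reaction windows while barely moving the whole-interaction statistic, which is the intuition behind analysing behavior at several temporal granularities.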

Publication
Workshop Affect, Compagnon Artificiel, Interaction