Multimodal behavioral cues analysis of the sense of presence and co-presence during a social interaction with a virtual patient

Magalie Ochs, Jérémie Bousquet, Jean-Marie Pergandi, & Philippe Blache

Frontiers in Computer Science, 2022, 4:746804  @HAL

Evaluating the user’s experience is a key challenge when studying human-agent interaction. Beyond user satisfaction, this question is addressed in virtual reality through the sense of presence and social presence, generally assessed via subjective post-experience questionnaires. In this article we propose a novel approach for evaluating these notions automatically, by correlating objective multimodal cues produced by users with their subjective sense of presence and social presence. The study is based on a multimodal human-agent interaction corpus collected in a task-oriented context: a virtual environment aimed at training doctors to break bad news to a patient played by a virtual agent. Building on this corpus study, we applied machine learning to construct a model that predicts the user’s sense of presence and social presence from specific multimodal behavioral cues. We explore different classification algorithms and machine learning techniques (oversampling and clustering) to cope with the dimensionality of the dataset and to optimize prediction performance. The resulting models automatically and accurately predict the level of presence and social presence. The results highlight the relevance of a multimodal model, based on both verbal and non-verbal cues, as an objective measure of (social) presence. The contribution of the article is two-fold: 1/ proposing the first presence and social presence prediction models, offering a way to automatically evaluate the user’s experience, and 2/ showing the importance of multimodal information for describing these notions.
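To make the pipeline concrete, here is a minimal, self-contained sketch of the kind of approach the abstract describes: oversampling an imbalanced set of labeled multimodal cue vectors, then training a simple classifier to predict a presence level. The feature names (`speech_ratio`, `gaze_on_agent`, `gesture_rate`), the synthetic data, and the nearest-centroid classifier are all illustrative assumptions, not the authors' actual features, corpus, or algorithms.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical multimodal cue vector: [speech_ratio, gaze_on_agent, gesture_rate],
# with a binary label: 1 = high sense of presence, 0 = low (illustrative only).
def make_sample(label):
    base = (0.7, 0.8, 0.5) if label else (0.4, 0.3, 0.2)
    return [b + random.gauss(0, 0.05) for b in base], label

# Deliberately imbalanced synthetic corpus (40 high-presence vs 8 low-presence).
data = [make_sample(1) for _ in range(40)] + [make_sample(0) for _ in range(8)]

# Random oversampling: duplicate minority-class samples until classes balance.
counts = Counter(label for _, label in data)
minority = min(counts, key=counts.get)
minority_samples = [s for s in data if s[1] == minority]
while counts[minority] < max(counts.values()):
    data.append(random.choice(minority_samples))
    counts[minority] += 1

# Nearest-centroid classifier, standing in for the classifiers the paper compares.
def centroid(samples):
    n = len(samples)
    return [sum(x[i] for x, _ in samples) / n for i in range(3)]

cents = {lab: centroid([s for s in data if s[1] == lab]) for lab in counts}

def predict(x):
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
    return min(cents, key=lambda lab: dist(cents[lab]))

test_x, test_y = make_sample(1)
print(predict(test_x))  # prints 1: a high-presence-like cue vector
```

The same skeleton carries over directly to real toolkits: the duplication loop corresponds to random oversampling (or SMOTE-style synthetic oversampling), and the nearest-centroid step can be swapped for any standard classifier.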

Posted in Featured publication.