

ConAn: A Usable Tool for Multimodal Conversation Analysis

Anna Penzkofer, Philipp Müller, Felix Bühler, Sven Mayer, and Andreas Bulling
In: Proceedings of the 2021 ACM International Conference on Multimodal Interaction (ICMI 2021), pages 341-351, ACM, 2021.


Multimodal analysis of group behavior is a key task in human-computer interaction and in the social and behavioral sciences, but it is often limited to more easily controllable laboratory settings or requires elaborate multi-sensor setups and time-consuming manual data annotation. We present ConAn, a usable tool to explore and automatically analyze the non-verbal behavior of multiple persons during natural group conversations. In contrast to traditional multi-sensor setups, our tool requires only a single 360° camera and uses state-of-the-art computer vision methods to automatically extract behavioral indicators such as gaze direction, facial expressions, and speaking activity. As such, our tool allows for easy and fast deployment, and it supports researchers in understanding individual behavior and group interaction dynamics, and in quantifying user-object interactions. We illustrate the benefits of ConAn on three sample use cases: conversation analysis, assessment of collaboration quality, and the impact of technology on audience behavior. Taken together, ConAn represents an important step towards democratizing automatic conversation analysis in HCI and beyond.
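The pipeline described in the abstract, extracting per-frame behavioral indicators (gaze, speaking activity) and aggregating them into conversation statistics, can be sketched as follows. This is a minimal illustrative sketch, not ConAn's actual implementation or data model: all class names, field names, and the fixed frame rate are assumptions made for the example.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Optional

FPS = 30  # assumed camera frame rate (hypothetical)


@dataclass
class FrameObservation:
    """One person's automatically extracted indicators for one video frame.
    Field names are illustrative, not ConAn's actual data model."""
    frame: int
    person: str
    gaze_target: Optional[str]  # person or object the gaze ray hits, if any
    speaking: bool


def aggregate(observations):
    """Turn per-frame indicators into simple conversation statistics:
    speaking time per person and who-looks-at-what counts."""
    speaking_frames = defaultdict(int)
    gaze_counts = defaultdict(int)
    for obs in observations:
        if obs.speaking:
            speaking_frames[obs.person] += 1
        if obs.gaze_target is not None:
            gaze_counts[(obs.person, obs.gaze_target)] += 1
    # Convert frame counts to seconds using the assumed frame rate.
    speaking_seconds = {p: n / FPS for p, n in speaking_frames.items()}
    return speaking_seconds, dict(gaze_counts)


# Usage: three observations from a two-person conversation.
obs = [
    FrameObservation(0, "A", "B", True),
    FrameObservation(0, "B", "A", False),
    FrameObservation(1, "A", "whiteboard", True),
]
speaking, gaze = aggregate(obs)
```

Aggregations like these would underpin the individual-behavior and user-object interaction statistics the tool reports; the real system derives its indicators from computer vision models rather than from pre-labeled observations.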

