Sign Language Avatars: Animation and Comprehensibility

Michael Kipp; Alexis Heloir; Quan Nguyen

In: Proceedings of the 11th International Conference on Intelligent Virtual Agents (IVA 2011), September 15-17, Reykjavik, Iceland. Springer, 2011.


Many deaf people have significant reading problems. Written content, e.g. on web pages, is therefore not fully accessible to them. Embodied agents have the potential to communicate in the native language of this cultural group: sign language. However, state-of-the-art systems have limited comprehensibility, and standard evaluation methods are missing. In this paper, we present methods and discuss challenges for creating and evaluating a signing avatar. We extended the existing EMBR character animation system with prerequisite functionality, created a gloss-based animation tool, and developed a cyclic content creation workflow with the help of two deaf sign language experts. For evaluation, we introduce delta testing, a novel way of assessing comprehensibility by comparing avatars with human signers. While our system reached state-of-the-art comprehensibility within a short development time, we argue that future research needs to focus on nonmanual aspects and prosody to reach the comprehensibility levels of human signers.

Deutsches Forschungszentrum für Künstliche Intelligenz
German Research Center for Artificial Intelligence