Publication

Realizing Multimodal Behavior

Michael Kipp, Alexis Heloir, Marc Schröder, Patrick Gebhard

In: Proceedings of the 10th International Conference on Intelligent Virtual Agents (IVA-2010), September 20-22, Philadelphia, Pennsylvania, United States, pages 57-63, Springer, 2010.

Abstract

Generating coordinated multimodal behavior for an embodied agent (speech, gesture, facial expression, ...) is challenging. It requires a high degree of animation control, in particular when reactive behaviors are required. We suggest distinguishing realization planning, where gesture and speech are processed symbolically using the behavior markup language (BML), from presentation, which is controlled by a lower-level animation language (EMBRScript). Reactive behaviors can bypass planning and directly control presentation. In this paper, we show how to define a behavior lexicon, how this lexicon relates to BML and how to resolve timing using formal constraint solvers. We conclude by demonstrating how to integrate reactive emotional behaviors.
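
The timing resolution the abstract mentions can be sketched, purely for illustration, with an off-the-shelf constraint solver. The Python snippet below (assuming the z3-solver package) aligns a hypothetical gesture's stroke phase with a speech synchronization point; the durations, names and lexicon entry are made up and this is not the paper's actual formulation.

from z3 import Real, Solver, sat

# Illustrative sketch only (not the authors' implementation): the abstract says
# timing is resolved with formal constraint solvers; here an off-the-shelf
# solver (z3-solver) aligns a gesture stroke with a speech sync point.
# All durations, names, and the lexicon entry below are hypothetical.

def resolve_gesture_timing(speech_sync_time, prep_dur, stroke_dur, retract_dur):
    start, stroke_start, stroke_end, end = (
        Real(n) for n in ("start", "stroke_start", "stroke_end", "end"))
    s = Solver()
    # Phase durations taken from a (hypothetical) behavior lexicon entry.
    s.add(stroke_start - start == prep_dur)
    s.add(stroke_end - stroke_start == stroke_dur)
    s.add(end - stroke_end == retract_dur)
    # BML-style sync constraint: stroke onset coincides with the speech sync point.
    s.add(stroke_start == speech_sync_time)
    assert s.check() == sat
    m = s.model()
    return {str(v): float(m[v].as_fraction())
            for v in (start, stroke_start, stroke_end, end)}

# Example: the TTS engine reports the stressed syllable at t = 1.42 s.
print(resolve_gesture_timing(1.42, prep_dur=0.4, stroke_dur=0.2, retract_dur=0.5))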


kipp_etal2010.pdf (PDF, 1 MB)

German Research Center for Artificial Intelligence