REAL: Situated Dialogues in Instrumented Environments

Christoph Stahl, Jörg Baus, Antonio Krüger, Dominikus Heckmann, Rainer Wasinger, Michael Schneider

In: AVI Workshop on Invisible and Transparent Interfaces (ITI), Gallipoli, pp. 10-15, 2004.


We give a survey of the research project REAL, in which we investigate how a system can proactively assist its user in solving different tasks in an instrumented environment by sensing implicit interaction and utilising distributed presentation media. First, we introduce the architecture of our instrumented environment, which uses a blackboard to coordinate the environment's components, such as the sensing and positioning services and the interaction devices. A ubiquitous user model provides contextual information on the user's characteristics, actions and locations, and users may access and control their profiles via a web interface. We then present two mobile applications that employ this environmental support for situated dialogues: a shopping assistant and a pedestrian navigation system. Both applications allow for multi-modal interaction through a combination of speech, gesture and sensed actions such as motion.
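The blackboard coordination mentioned above can be illustrated with a minimal sketch: components post facts to a shared space, and other components subscribe to the topics they care about. All class, topic and display names here are illustrative assumptions, not the actual REAL implementation.

```python
# Minimal blackboard sketch: components post facts to a shared space and
# subscribers on a topic are notified. Names are hypothetical, for
# illustration only.
from collections import defaultdict
from typing import Callable, Dict, List


class Blackboard:
    """Shared data space coordinating loosely coupled components."""

    def __init__(self) -> None:
        self._facts: Dict[str, object] = {}
        self._subscribers: Dict[str, List[Callable[[object], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[object], None]) -> None:
        # Register a component's interest in a topic.
        self._subscribers[topic].append(callback)

    def post(self, topic: str, value: object) -> None:
        # Store the fact and notify every subscriber of this topic.
        self._facts[topic] = value
        for callback in self._subscribers[topic]:
            callback(value)

    def read(self, topic: str) -> object:
        # Look up the most recent fact on a topic, if any.
        return self._facts.get(topic)


# Usage sketch: a positioning service posts the user's location; a
# presentation component reacts by selecting a nearby display.
bb = Blackboard()
bb.subscribe("user.location", lambda loc: print(f"present on display near {loc}"))
bb.post("user.location", "aisle-3")
```

The design choice this pattern reflects is that sensing services and presentation media never call each other directly; they communicate only through the blackboard, which keeps the instrumented environment extensible.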

German Research Center for Artificial Intelligence (Deutsches Forschungszentrum für Künstliche Intelligenz, DFKI)