Towards Relational POMDPs for Adaptive Dialogue Management

Pierre Lison

In: Proceedings of the Student Research Workshop of the 48th Annual Meeting of the Association for Computational Linguistics (ACL-HLT-10), July 12-14, Uppsala, Sweden. Association for Computational Linguistics, 2010.


Open-ended spoken interactions are typically characterised by both structural complexity and high levels of uncertainty, making dialogue management in such settings a particularly challenging problem. Traditional approaches have focused on providing theoretical accounts for either the uncertainty or the complexity of spoken dialogue, but have rarely considered the two issues in tandem. This paper describes ongoing work on a new approach to dialogue management which attempts to fill this gap. We represent the interaction as a Partially Observable Markov Decision Process (POMDP) over a rich state space incorporating dialogue, user, and environment models. The tractability of the resulting POMDP can be preserved using a mechanism for dynamically constraining the action space based on prior knowledge over locally relevant dialogue structures. These constraints are encoded in a small set of general rules expressed as a Markov Logic network. The first-order expressivity of Markov Logic enables us to leverage the rich relational structure of the problem and efficiently abstract over large regions of the state and action spaces.
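The core idea of the abstract, maintaining a belief state over the dialogue and dynamically pruning the action space with rules over that belief, can be illustrated with a minimal sketch. All intents, actions, and probabilities below are invented for illustration, and the paper's Markov Logic network constraints are approximated here by simple hand-written rules over the most likely dialogue state:

```python
# Toy dialogue POMDP sketch: a Bayesian belief over user intents plus
# rule-based pruning of the system's action space. Everything here
# (states, actions, numbers) is hypothetical; the paper itself encodes
# the action constraints as a Markov Logic network.

INTENTS = ["greet", "request_info", "close"]
ACTIONS = ["greet_back", "provide_info", "clarify", "goodbye"]

# Observation model P(observation | intent): invented probabilities.
OBS_MODEL = {
    "hello":  {"greet": 0.8, "request_info": 0.1, "close": 0.1},
    "where?": {"greet": 0.1, "request_info": 0.8, "close": 0.1},
    "bye":    {"greet": 0.1, "request_info": 0.1, "close": 0.8},
}

def belief_update(belief, observation):
    """Bayesian filtering over user intents (state kept static for simplicity)."""
    likelihood = OBS_MODEL[observation]
    unnorm = {s: belief[s] * likelihood[s] for s in belief}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

def admissible_actions(belief, threshold=0.5):
    """Prune the action space with rules over the current belief.
    This stands in for the relational constraints described in the paper."""
    top_intent = max(belief, key=belief.get)
    if belief[top_intent] < threshold:
        return ["clarify"]  # too uncertain: only clarification is admissible
    rules = {
        "greet": ["greet_back"],
        "request_info": ["provide_info", "clarify"],
        "close": ["goodbye"],
    }
    return rules[top_intent]

belief = {s: 1.0 / len(INTENTS) for s in INTENTS}  # uniform prior
belief = belief_update(belief, "where?")
print(admissible_actions(belief))
```

The point of the sketch is the separation of concerns: the belief update handles the uncertainty of the input, while the rules over the belief state handle the structural constraints, so the planner only ever deliberates over a small admissible subset of actions.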


main.relpomdps.acl2010.pdf (PDF, 314 KB)

Deutsches Forschungszentrum für Künstliche Intelligenz (German Research Center for Artificial Intelligence, DFKI)