Museum Guide 2.0 - An Eye-Tracking based Personal Assistant for Museums and Exhibits
Takumi Toyama; Thomas Kieninger; Faisal Shafait; Andreas Dengel
In: L. Ciolfi; K. Scott; S. Barbieri (Eds.). Re-Thinking Technology in Museums 2011: Emerging Experiences. International Conference on Re-Thinking Technology in Museums, May 26-27, Limerick, Ireland, ISBN 1-905952-31-7 / 978-1-905952-31-1, University of Limerick, 5/2011.
This paper describes a new prototypical application that combines a head-mounted mobile eye tracker with content-based image retrieval technology. The application, named Museum Guide 2.0, acts as an unobtrusive personal guide for museum visitors. When it detects that the user is looking at a specific art object, it provides audio information on that object via earphones. The mobile eye tracker observes the visitor's eye movements and synchronizes the images of the scene camera with the detected eye fixations. The built-in image retrieval subsystem recognizes which of the art objects in the exhibition, if any, is currently fixated by the user's eyes. Challenges faced during our research include modifying the retrieval process to exploit a given fixation for better accuracy, and detecting conscious attention to one specific object as the trigger event for information delivery while distinguishing it from noise (unconscious fixations). This paper focuses on the application aspect of Museum Guide 2.0. It describes how a database of the exhibited art objects is created from scratch and how the runtime application is used. We conclude with a user study conducted to evaluate the acceptance of the system, specifically in contrast to conventional audio-player-based approaches.
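The trigger logic sketched in the abstract, treating a sustained fixation on one recognized object as conscious attention and brief glances as noise, can be illustrated with a simple dwell-time filter. This is a minimal sketch under assumed conventions, not the paper's actual detection method: the function name, the stream format of (timestamp, recognized object) pairs, and the 1.0 s threshold are all hypothetical.

```python
from typing import Iterable, Iterator, Optional, Tuple

DWELL_THRESHOLD = 1.0  # seconds; assumed value, not taken from the paper


def detect_conscious_fixations(
    samples: Iterable[Tuple[float, Optional[str]]],
    threshold: float = DWELL_THRESHOLD,
) -> Iterator[Tuple[str, float]]:
    """Yield (object_id, start_time) once one object has been fixated
    continuously for at least `threshold` seconds.

    `samples` is a time-ordered stream of (timestamp, object_id) pairs,
    where object_id is the exhibit recognized at the gaze point by the
    retrieval subsystem (None if no exhibit is recognized).
    """
    current = None        # object currently being fixated
    start = 0.0           # when the current fixation run began
    triggered = False     # avoid firing twice for the same run

    for t, obj in samples:
        if obj != current:
            # Gaze moved to a different object (or away): reset the run.
            current, start, triggered = obj, t, False
        elif obj is not None and not triggered and t - start >= threshold:
            # Same object held long enough: fire the trigger event once.
            triggered = True
            yield obj, start


# Example: a short glance interrupted at 0.3 s is noise; the later
# 1.2 s dwell on the same object fires a single trigger event.
stream = [(0.0, "obj_17"), (0.3, None), (0.5, "obj_17"),
          (1.0, "obj_17"), (1.7, "obj_17")]
print(list(detect_conscious_fixations(stream)))  # → [('obj_17', 0.5)]
```

In a running guide, each trigger event would start audio playback for the recognized exhibit; the reset-on-change logic is what separates deliberate viewing from saccades across neighboring objects.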