

Interactive Fixation-to-AOI Mapping for Mobile Eye Tracking Data Based on Few-Shot Image Classification

Michael Barz; Omair Shahzad Bhatti; Hasan Md Tusfiqur Alam; Ho Minh Duy Nguyen; Daniel Sonntag
In: Companion Proceedings of the 28th International Conference on Intelligent User Interfaces. International Conference on Intelligent User Interfaces (IUI-2023), Sydney, NSW, Australia, Pages 175-178, IUI '23 Companion, ISBN 9798400701078, Association for Computing Machinery, 2023.


Mobile eye tracking is an important tool in psychology and human-centred interaction design for understanding how people process visual scenes and user interfaces. However, analysing recordings from mobile eye trackers, which typically include an egocentric video of the scene and a gaze signal, is a time-consuming and largely manual process. To address this challenge, we propose a web-based annotation tool that leverages few-shot image classification and interactive machine learning (IML) to accelerate the annotation process. The tool allows users to efficiently map fixations to areas of interest (AOI) in a video-editing-style interface. It includes an IML component that generates suggestions and learns from user feedback using a few-shot image classification model initialised with a small number of images per AOI. Our goal is to improve the efficiency and accuracy of fixation-to-AOI mapping in mobile eye tracking.
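The IML component is described as a few-shot image classification model that is initialised with a small number of images per AOI and refined from user feedback. One common way to realise such a model is a nearest-prototype classifier over image embeddings (in the spirit of prototypical networks); the following minimal sketch assumes precomputed feature vectors for fixation image crops, and all class and method names are hypothetical, not the authors' implementation:

```python
import numpy as np

class FewShotAOIClassifier:
    """Hypothetical nearest-prototype few-shot classifier: each AOI is
    represented by the mean embedding of its support examples, and a
    fixation crop is assigned to the AOI with the closest prototype."""

    def __init__(self):
        self.examples = {}    # AOI label -> list of support embeddings
        self.prototypes = {}  # AOI label -> mean (prototype) embedding

    def add_example(self, label, embedding):
        """Register a support image embedding for an AOI. Interactive
        feedback (accepting/correcting a suggestion) can call this to
        update the model incrementally."""
        vec = np.asarray(embedding, dtype=float)
        self.examples.setdefault(label, []).append(vec)
        self.prototypes[label] = np.mean(self.examples[label], axis=0)

    def predict(self, embedding):
        """Suggest the AOI whose prototype is nearest (Euclidean
        distance) to the given fixation-crop embedding."""
        vec = np.asarray(embedding, dtype=float)
        return min(
            self.prototypes,
            key=lambda lab: np.linalg.norm(self.prototypes[lab] - vec),
        )

# Example: initialise with a few labelled crops per AOI, then suggest
# labels for new fixations; corrections feed back via add_example().
clf = FewShotAOIClassifier()
clf.add_example("screen", [1.0, 0.1])
clf.add_example("screen", [0.9, 0.0])
clf.add_example("keyboard", [0.0, 1.0])
suggestion = clf.predict([0.8, 0.2])  # nearest prototype is "screen"
```

In a real pipeline the embeddings would come from a pretrained image encoder applied to a crop around each fixation; updating prototypes on every accepted or corrected suggestion is what makes the loop interactive.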
