Publication
eyeNotate: Interactive Annotation of Mobile Eye Tracking Data Based on Few-Shot Image Classification
Michael Barz; Omair Shahzad Bhatti; Hasan Md Tusfiqur Alam; Ho Minh Duy Nguyen; Kristin Altmeyer; Sarah Malone; Daniel Sonntag
In: Journal of Eye Movement Research (JEMR), Vol. 18, No. 4, Pages 1-35, MDPI, July 2025.
Abstract
Mobile eye tracking is an important tool in psychology and human-centered interaction design for understanding how people process visual scenes and user interfaces. However, analyzing recordings from head-mounted eye trackers, which typically include an egocentric video of the scene and a gaze signal, is a time-consuming and largely manual process. To address this challenge, we develop eyeNotate, a web-based annotation tool that enables semi-automatic data annotation and learns to improve from corrective user feedback. Users can manually map fixation events to areas of interest (AOIs) in a video-editing-style interface (baseline version). In addition, our tool can generate fixation-to-AOI mapping suggestions based on a few-shot image classification model (IML-support version). We conduct an expert study with trained annotators (n = 3) to compare the baseline and IML-support versions, measuring perceived usability, the validity and reliability of the annotations, and efficiency during a data annotation task. Participants re-annotate data from a single individual in an existing dataset (n = 48). In a semi-structured interview, we examine how participants used the provided IML features and assess our design decisions. In a post hoc experiment, we investigate the performance of three image classification models in annotating the data of the remaining 47 individuals.
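To illustrate the kind of fixation-to-AOI suggestion mechanism the abstract describes, the sketch below shows a generic prototype-based few-shot classifier: a frozen, pretrained backbone embeds image crops around fixation points, a handful of labeled crops per AOI are averaged into class prototypes, and new crops are assigned to the nearest prototype. This is a minimal, hypothetical example and not the eyeNotate implementation; all function names, the choice of backbone, and the data layout are assumptions.

```python
# Sketch of prototype-based few-shot AOI classification for fixation crops.
# Illustrative only; names and model choice are assumptions, not the paper's method.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

# Frozen ImageNet-pretrained backbone used purely as a feature extractor (assumption).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the classification head
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def embed(crop: Image.Image) -> torch.Tensor:
    """Embed an image patch around a fixation point into a normalized feature vector."""
    x = preprocess(crop).unsqueeze(0)
    return F.normalize(backbone(x), dim=-1).squeeze(0)

def build_prototypes(support: dict[str, list[Image.Image]]) -> dict[str, torch.Tensor]:
    """Average the embeddings of a few labeled crops per AOI into one prototype each."""
    return {
        aoi: F.normalize(torch.stack([embed(img) for img in imgs]).mean(0), dim=-1)
        for aoi, imgs in support.items()
    }

def suggest_aoi(crop: Image.Image, prototypes: dict[str, torch.Tensor]) -> tuple[str, float]:
    """Suggest the AOI whose prototype is most similar (cosine) to the crop's embedding."""
    z = embed(crop)
    scores = {aoi: float(z @ p) for aoi, p in prototypes.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

# Usage (hypothetical data): corrective user feedback can simply extend `support`
# with newly confirmed crops before rebuilding the prototypes.
# prototypes = build_prototypes({"worksheet": [...], "tablet": [...]})
# aoi, confidence = suggest_aoi(fixation_crop, prototypes)
```

In such a setup, incorporating a corrected annotation only requires adding the confirmed crop to the support set and recomputing one prototype, which keeps the feedback loop cheap.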
Projects
- MASTER - Mixed reality ecosystem for teaching robotics in manufacturing
- No-IDLE - Interactive Deep Learning Enterprise