

Gaze-guided Object Classification Using Deep Neural Networks for Attention-based Computing

Michael Barz; Daniel Sonntag
In: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct. International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp-16), September 12-16, Heidelberg, Germany, Pages 253-256, ISBN 978-1-4503-4462-3, ACM, 9/2016.


Recent advances in eye tracking technology have opened the way to designing novel attention-based user interfaces. This is promising for pro-active and assistive technologies for cyber-physical systems in domains such as healthcare and Industry 4.0. Prior approaches to recognizing a user's attention are usually limited to the raw gaze signal or to sensors in instrumented environments. We propose a system that (1) combines the gaze signal with the egocentric camera of the eye tracker to identify the objects the user focuses on; (2) employs deep-learning-based object classification, which we recompiled for our purposes to run on a GPU-based image classification server; (3) detects whether the user actually pays attention to that object; and (4) combines these modules to construct episodic memories of egocentric events in real time.
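The four-stage pipeline from the abstract can be sketched in a few dozen lines. The sketch below is illustrative only and not the authors' implementation: the classifier is a dummy stand-in for the deep network on the GPU server, the dwell-time fixation test is a common proxy for attention detection (the paper does not specify its exact criterion), and all names, thresholds, and data structures are assumptions.

```python
import numpy as np
from collections import deque
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: int     # gaze point in egocentric-frame pixel coordinates
    y: int
    t: float   # timestamp in seconds

def crop_around_gaze(frame, gaze, size=64):
    """Stage 1: cut a square patch centred on the gaze point, clamped
    to the frame borders, as the candidate focused-object region."""
    h, w = frame.shape[:2]
    half = size // 2
    x0 = min(max(gaze.x - half, 0), w - size)
    y0 = min(max(gaze.y - half, 0), h - size)
    return frame[y0:y0 + size, x0:x0 + size]

def classify(patch):
    """Stage 2: placeholder for the deep-learning classifier (a network
    on a GPU-based server in the paper). Here a dummy rule on the patch
    mean stands in so the sketch stays self-contained."""
    return "bright-object" if patch.mean() > 127 else "dark-object"

def dwell_attention(samples, radius=30, min_dwell=0.3):
    """Stage 3: report attention when recent gaze samples stay within
    `radius` pixels of their centroid for at least `min_dwell` seconds
    (a simple fixation/dwell-time proxy; thresholds are assumptions)."""
    if len(samples) < 2:
        return False
    pts = np.array([(s.x, s.y) for s in samples], dtype=float)
    spread = np.linalg.norm(pts - pts.mean(axis=0), axis=1).max()
    duration = samples[-1].t - samples[0].t
    return spread <= radius and duration >= min_dwell

# Stage 4: episodic memory as a list of timestamped attention events.
memory = []
window = deque(maxlen=10)   # sliding window of recent gaze samples

def process(frame, gaze):
    """Feed one synchronized (frame, gaze) pair through the pipeline."""
    window.append(gaze)
    if dwell_attention(list(window)):
        label = classify(crop_around_gaze(frame, gaze))
        memory.append({"t": gaze.t, "label": label,
                       "gaze": (gaze.x, gaze.y)})
```

Feeding a stream of synchronized frames and gaze samples into `process` yields an episodic log of what the user attended to and when; in a real system the dummy `classify` would be replaced by a remote call to the image classification server.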
