Combining Touch and Gaze for Distant Selection in a Tabletop Setting
Interacting with objects both within and beyond arm's reach is a common task on tabletops, in real-world as well as virtual settings. Gaze as an additional input modality can support such interactions in terms of searching for, selecting, and manipulating distant objects. The aim of this work is to design and evaluate an interaction technique that combines gaze and gestural touch input for the selection of distant objects. The proposed approach makes objects that are out of physical reach easily available to the user and aims to provide higher selection accuracy than single-modality approaches. The paper contributes a setup for tracking people with a static eye tracker in front of a tabletop and investigates an interaction technique that augments the flicking gesture with gaze information to select distant objects.
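The abstract only summarizes the technique, so the following is a minimal sketch of how a gaze-augmented flick selection could work, not the paper's actual implementation: candidate objects lying within a cone around the flick direction are scored by a weighted combination of angular deviation from the flick and distance to the current gaze point. All names, the cone threshold, and the weighting scheme are assumptions for illustration.

```python
import math

# Hypothetical sketch of gaze-augmented flick selection. The scoring
# (mixing degrees of angular deviation with table-coordinate distance
# to the gaze point) is an assumed heuristic, not the paper's method.
def select_target(flick_origin, flick_vector, gaze_point, objects,
                  gaze_weight=0.5, max_angle_deg=30.0):
    """Pick the distant object best matching the flick direction,
    disambiguated by proximity to the user's estimated gaze point.

    flick_origin, gaze_point: (x, y) tuples in table coordinates.
    flick_vector: (dx, dy) direction of the flick gesture.
    objects: list of dicts with 'id' and 'pos' keys.
    """
    fx, fy = flick_vector
    flen = math.hypot(fx, fy)
    if flen == 0:
        return None  # degenerate flick, no direction to follow
    best, best_score = None, float("inf")
    for obj in objects:
        ox, oy = obj["pos"]
        dx, dy = ox - flick_origin[0], oy - flick_origin[1]
        dist = math.hypot(dx, dy)
        if dist == 0:
            continue  # object under the finger is not a distant target
        # Angular deviation between flick direction and object direction.
        cos_a = max(-1.0, min(1.0, (fx * dx + fy * dy) / (flen * dist)))
        angle = math.degrees(math.acos(cos_a))
        if angle > max_angle_deg:
            continue  # outside the flick's selection cone
        # Gaze disambiguates between objects along the same flick ray.
        gaze_dist = math.hypot(ox - gaze_point[0], oy - gaze_point[1])
        score = (1 - gaze_weight) * angle + gaze_weight * gaze_dist
        if score < best_score:
            best, best_score = obj, score
    return best["id"] if best else None
```

With two objects lying along the same flick direction, the gaze point breaks the tie: `select_target((0, 0), (1, 0), (10, 0.6), objects)` picks the object nearer the gaze sample. This illustrates the motivating idea that the flick alone is ambiguous at a distance and gaze supplies the missing precision.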