Using hands and feet to navigate and manipulate spatial data

Johannes Schöning, Florian Daiber, Antonio Krüger, Michael Rohs

In: CHI EA '09: Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '09), April 4-9, 2009, Boston, MA, United States, pages 4663-4668, ISBN 978-1-60558-247-4, ACM, 2009.


We demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive systems. The geospatial domain is an interesting example for showing the advantages of combining both modalities, because the complex user interfaces of common Geographic Information Systems (GIS) require a high degree of expertise from their users. Recent developments in interactive surfaces that enable the construction of low-cost multi-touch displays, together with relatively cheap sensor technology for detecting foot gestures, allow a deep exploration of these input modalities for GIS users with medium or low expertise. In this paper, we provide a categorization of multi-touch hand and foot gestures for the interaction with spatial data on a large-scale interactive wall. In addition, we show with an initial evaluation how these gestures can improve the overall interaction with spatial information.

Further links

UsingHandsandFeetNavigateandmanipulateSpatialData.pdf (pdf, 693 KB)

Deutsches Forschungszentrum für Künstliche Intelligenz (German Research Center for Artificial Intelligence)