Publication

Towards Recognising Collaborative Activities Using Multiple On-body Sensors

Jamie Ward, Gerald Pirkl, Peter Hevesi, Paul Lukowicz

In: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct (UbiComp '16), New York, NY, USA, pp. 221-224, ACM, 2016. ISBN 978-1-4503-4462-3.

Abstract

This paper describes the initial stages of new work on recognising collaborative activities involving two or more people. In the experiment described, a physically demanding construction task is completed by a team of four volunteers. The task, to build a large video wall, requires communication, coordination, and physical collaboration between group members. Minimal outside assistance is provided to better reflect the ad-hoc and loosely structured nature of real-world construction tasks. On-body inertial measurement units (IMUs) record each subject's head and arm movements; a wearable eye-tracker records gaze and egocentric video; and audio is recorded from each person's head and dominant arm. A first look at the data reveals promising correlations between, for example, the movement patterns of two people carrying a heavy object. Also revealed are clues as to how complementary information from different sensor types, such as sound and vision, might further aid collaboration recognition.
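The abstract's observation about correlated movement patterns suggests a simple way such coupling might be measured: comparing the two subjects' accelerometer streams with a windowed, normalised cross-correlation. The sketch below is illustrative only and not taken from the paper; the sampling rate, window length, and function names are assumptions made for the example.

```python
# Illustrative sketch (not from the paper): scoring movement similarity
# between two subjects' IMU accelerometer streams via windowed,
# normalised cross-correlation. Sampling rate, window length, and all
# names here are assumptions, not the authors' method.

import numpy as np

FS = 50          # assumed IMU sampling rate in Hz
WINDOW = 2 * FS  # 2-second analysis window (assumed)

def accel_magnitude(acc_xyz: np.ndarray) -> np.ndarray:
    """Collapse an (N, 3) accelerometer stream to its per-sample magnitude."""
    return np.linalg.norm(acc_xyz, axis=1)

def normalised_xcorr_peak(a: np.ndarray, b: np.ndarray) -> float:
    """Peak of the normalised cross-correlation of two equal-length windows."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    xcorr = np.correlate(a, b, mode="full") / len(a)
    return float(np.max(xcorr))

def joint_motion_scores(acc_subj1: np.ndarray, acc_subj2: np.ndarray) -> np.ndarray:
    """Slide a window over both subjects' acceleration magnitudes and return
    one similarity score per window; high scores hint at coupled motion,
    e.g. two people carrying the same heavy object."""
    m1, m2 = accel_magnitude(acc_subj1), accel_magnitude(acc_subj2)
    n = min(len(m1), len(m2))
    scores = []
    for start in range(0, n - WINDOW + 1, WINDOW):
        scores.append(normalised_xcorr_peak(m1[start:start + WINDOW],
                                            m2[start:start + WINDOW]))
    return np.array(scores)

if __name__ == "__main__":
    # Synthetic demo: both subjects share a common low-frequency lifting motion.
    t = np.arange(0, 10, 1 / FS)
    shared = np.sin(2 * np.pi * 0.8 * t)
    acc1 = np.column_stack([shared + 0.1 * np.random.randn(len(t))] * 3)
    acc2 = np.column_stack([shared + 0.1 * np.random.randn(len(t))] * 3)
    print(joint_motion_scores(acc1, acc2))
```

Such per-window scores could, under these assumptions, serve as one input feature for recognising physically collaborative episodes alongside the gaze and audio channels mentioned in the abstract.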
