Detecting Physical Collaborations in a Group Task Using Body-Worn Microphones and Accelerometers

Jamie A Ward; Paul Lukowicz; Gerald Pirkl; Peter Hevesi

In: 13th Workshop on Context and Activity Modeling and Recognition (CoMoRea'17). IEEE International Conference on Pervasive Computing and Communications (PerCom), Big Island, USA, IEEE, 2017.


This paper presents a method of using wearable accelerometers and microphones to detect instances of ad-hoc physical collaboration between members of a group. Four people are instructed to construct a large video wall and must cooperate to complete the task. The task is loosely structured, with minimal outside assistance, to better reflect the ad-hoc nature of many real-world construction scenarios. Audio data, recorded from chest-worn microphones, is used to reveal information on collocation, i.e. whether or not participants are near one another. Movement data, recorded using 3-axis accelerometers worn on each person's head and wrists, is used to provide information on correlated movements, such as when participants help one another lift a heavy object. Collocation and correlated-movement information are then combined to determine who is working together at any given time. The work shows how data from commonly available sensors can be combined across multiple people, using a simple, low-power algorithm, to detect a range of physical collaborations.
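The fusion step described in the abstract could be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes that collocation is estimated by comparing ambient audio envelopes from two chest-worn microphones, that correlated movement is estimated by cross-correlating accelerometer magnitudes at zero lag, and that the two cues are combined with simple thresholds (the threshold values here are arbitrary placeholders):

```python
import numpy as np

def correlation_score(a, b):
    """Normalised zero-lag correlation between two 1-D signals.

    Both signals are z-scored first; the result lies roughly in [-1, 1].
    A small epsilon avoids division by zero for constant signals.
    """
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.mean(a * b))

def detect_collaboration(audio_a, audio_b, accel_a, accel_b,
                         audio_thresh=0.5, accel_thresh=0.5):
    """Flag a collaboration between two people in a time window when
    BOTH cues fire: their ambient audio is similar (collocation) and
    their movement signals are correlated (joint physical action).

    Thresholds are illustrative assumptions, not values from the paper.
    """
    collocated = correlation_score(audio_a, audio_b) > audio_thresh
    moving_together = correlation_score(accel_a, accel_b) > accel_thresh
    return collocated and moving_together
```

In a multi-person setting this check would run over short sliding windows for every pair of participants, which keeps the per-window cost linear in window length and so remains cheap enough for the low-power setting the abstract targets.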

Deutsches Forschungszentrum für Künstliche Intelligenz (German Research Center for Artificial Intelligence, DFKI)