Publication

User independent, multi-modal spotting of subtle arm actions with minimal training data

Gerald Bauer, Ulf Blanke, Paul Lukowicz, Bernt Schiele

In: Pervasive Computing and Communications Workshops (PerCom Workshops), 2013 IEEE International Conference on (PerCom 2013), March 18-22, San Diego, California, United States, pages 8-13. ISBN 978-1-4673-5075-4. IEEE, 2013.

Abstract

We address a specific, particularly difficult class of activity recognition problems defined by (1) subtle, hardly discriminative hand motions such as a short press or pull, (2) a large, ill-defined NULL class (any other hand motion a person may perform during normal life), and (3) the difficulty of collecting sufficient training data that generalizes well from one to multiple users. In essence, we intend to spot activities such as opening a cupboard, pressing a button, or taking an object from a shelf in a large data stream that contains typical everyday activity. We focus on body-worn sensors without instrumenting objects, we exploit available infrastructure information, and we perform a one-to-many-users training scheme for minimal training effort. We demonstrate that a state-of-the-art approach based on motion sensors performs poorly under such conditions (Equal Error Rate of 18% in our experiments). We present and evaluate a new multi-modal system based on a combination of indoor location with a wrist-mounted proximity sensor, camera, and inertial sensor that raises the EER to 79%.

Further links

User_Independent,_Multi-Modal_Spotting_of_Subtle_Arm_Actions_with_Minimal.pdf (pdf, 3 MB )

German Research Center for Artificial Intelligence (Deutsches Forschungszentrum für Künstliche Intelligenz, DFKI)