Solving the Sensor-based Activity Recognition Problem (SOAR): Self-supervised, Multi-modal Recognition of Activities from Wearable Sensors

Harish Haresamudram; Chi Ian Tang; Sungho Suh; Paul Lukowicz; Thomas Ploetz
In: ISWC '23: Proceedings of the 2023 ACM International Symposium on Wearable Computers. International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2023), October 8-12, Cancun, Mexico. ACM, October 2023.


Feature extraction lies at the core of Human Activity Recognition (HAR): the automated inference of what activity is being performed. Traditionally, the HAR community used statistical metrics and distribution-based representations to summarize the movement present in windows of sensor data into feature vectors. More recently, learned representations have been used successfully in lieu of such handcrafted, manually engineered features. In particular, the community has shown substantial interest in self-supervised methods, which leverage large-scale unlabeled data to first learn useful representations that are subsequently fine-tuned to the target applications. In this tutorial, we focus on representations for single-sensor and multi-modal setups, and go beyond the current de facto practice of learning representations from scratch. We also discuss the economic use of existing representations, specifically via transfer learning and domain adaptation. The proposed tutorial will introduce state-of-the-art methods for representation learning in HAR, and provide a forum for researchers from mobile and ubiquitous computing not only to discuss the current state of the field but also to chart future directions for the field itself, including answering what it would take to finally solve the activity recognition problem.
