
Publication

Let there be IMU data: generating training data for wearable, motion sensor based activity recognition from monocular RGB videos

Vitor Fortes Rey; Peter Hevesi; Onorina Kovalenko; Paul Lukowicz
In: UbiComp/ISWC '19 Adjunct: Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers (UbiComp 2019), September 9-14, London, United Kingdom, pages 699-708, ACM, September 2019.

Abstract

Recent advances in Machine Learning, in particular Deep Learning, have been driving rapid progress in fields such as computer vision and natural language processing. Human activity recognition (HAR) using wearable sensors, which has been a thriving research field for the last 20 years, has benefited much less from these advances. This is largely due to the lack of adequate amounts of labeled training data. In this paper we propose a method to mitigate the labeled-data problem in wearable HAR by generating wearable motion data from monocular RGB videos, which can be collected from popular video platforms such as YouTube. Our method works by extracting 2D poses from video frames and then using a regression model to map them to sensor signals. We have validated the method on fitness exercises as the domain for which activity recognition is trained, and shown that a HAR model can be improved using data produced from a YouTube video.
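To make the pipeline in the abstract concrete, the following is a minimal sketch of the pose-to-sensor regression step: a model that takes a window of 2D poses (as produced by an off-the-shelf pose estimator) and regresses the corresponding IMU signal. The network architecture, joint count, sensor layout, and all names below are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn

N_JOINTS = 17          # assumed COCO-style 2D keypoints from a pose estimator
IMU_CHANNELS = 3       # assumed single wrist accelerometer (x, y, z)

class Pose2IMU(nn.Module):
    """Maps a window of 2D poses to an IMU signal over the same window."""
    def __init__(self, hidden=128):
        super().__init__()
        # Recurrent encoder over the pose sequence, one time step per frame.
        self.rnn = nn.LSTM(input_size=N_JOINTS * 2, hidden_size=hidden,
                           num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, IMU_CHANNELS)

    def forward(self, poses):            # poses: (batch, time, N_JOINTS * 2)
        feats, _ = self.rnn(poses)
        return self.head(feats)          # (batch, time, IMU_CHANNELS)

# Training sketch: (pose window, real IMU window) pairs would come from a
# small dataset where video and wearable sensors were recorded together.
model = Pose2IMU()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

poses = torch.randn(8, 100, N_JOINTS * 2)   # dummy stand-in data
imu = torch.randn(8, 100, IMU_CHANNELS)
for _ in range(3):
    opt.zero_grad()
    loss = loss_fn(model(poses), imu)
    loss.backward()
    opt.step()

Once trained, such a model could be run over poses extracted from unlabeled YouTube videos to synthesize additional labeled sensor data for the HAR classifier.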
