Selecting the Motion Ground Truth for Loose-fitting Wearables: Benchmarking Optical MoCap Methods
Lala Shakti Swarup Ray; Bo Zhou; Sungho Suh; Paul Lukowicz
In: Proceedings of the 2023 International Symposium on Wearable Computers. ACM International Symposium on Wearable Computers (ISWC-2023), October 8-12, Cancun, Mexico, Pages 27-32, ACM, 2023.
To aid smart wearable researchers in selecting an optimal ground truth method for motion capture (MoCap) across loose garment types, we introduce a benchmark: DrapeMoCapBench (DMCB), tailored to assess optical marker-based and marker-less MoCap performance. While high-cost marker-based systems are recognized as the precision standard, they require skin-tight markers over bony areas for accuracy, which is problematic with loose garments. Conversely, marker-less MoCap methods driven by computer vision models have matured, requiring only smartphone cameras and remaining cost-effective. DMCB uses real-world MoCap datasets and runs 3D physics simulations across diverse variables: six drape levels, three motion intensities, and six body-type/gender combinations. On this basis, it benchmarks advanced marker-based and marker-less MoCap techniques, identifying the superior approach for each scenario. When evaluating casual loose garments, both methods exhibit notable performance degradation (>10 cm). However, for everyday activities involving basic and fast motions, marker-less MoCap slightly outperforms marker-based alternatives, making it an advantageous and economical choice for wearable studies.
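The abstract reports pose errors in centimetres. A common way to express such MoCap error is the mean per-joint position error (MPJPE), the average Euclidean distance between predicted and ground-truth 3D joints; the sketch below illustrates that metric in general terms and is not taken from the paper itself (array shapes and the 17-joint skeleton are assumptions for illustration).

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: average Euclidean distance
    between predicted and ground-truth 3D joint positions,
    in the same units as the inputs (here metres)."""
    # pred, gt: arrays of shape (frames, joints, 3)
    return np.linalg.norm(pred - gt, axis=-1).mean()

# Toy example (hypothetical data): a prediction offset by 10 cm
# along x at every joint, mimicking a systematic drape-induced error.
gt = np.zeros((2, 17, 3))     # 2 frames, 17-joint skeleton (assumed)
pred = gt.copy()
pred[..., 0] += 0.10          # 10 cm offset on every joint
print(f"MPJPE: {mpjpe(pred, gt) * 100:.1f} cm")  # → MPJPE: 10.0 cm
```

Under this metric, the ">10 cm" degradation reported above corresponds to an average per-joint displacement larger than a hand's width, which is substantial for fine-grained activity recognition.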