Human motion is fundamentally driven by continuous physical interaction with the environment. Whether walking, running, or simply standing, the forces exchanged between our feet and the ground provide crucial insights for understanding and reconstructing human movement. Recent advances in wearable insole devices offer a compelling solution for capturing these forces in diverse, real-world scenarios. Sensor insoles pose no constraint on the user's motion (unlike mocap suits) and are unaffected by line-of-sight limitations (in contrast to optical systems). These qualities make sensor insoles an ideal choice for robust, unconstrained motion capture, particularly in outdoor environments. Surprisingly, leveraging these devices with recent motion reconstruction methods remains largely unexplored. Aiming to fill this gap, we present Step2Motion, the first approach to reconstruct human locomotion from multi-modal insole sensors. Our method utilizes pressure and inertial data (accelerations and angular rates) captured by the insoles to reconstruct human motion. We evaluate the effectiveness of our approach across a range of experiments to show its versatility for diverse locomotion styles, from simple gaits such as walking or jogging to moving sideways, walking on tiptoes, crouching slightly, or dancing.