Inertial sensor-based human activity recognition (HAR) underpins many human-centered mobile applications. Deep learning-based fine-grained HAR models enable accurate classification in various complex application scenarios. Nevertheless, the large storage and computational overhead of existing fine-grained deep HAR models hinders their widespread deployment on resource-limited platforms. Inspired by knowledge distillation's capacity for reasonable model compression and potential performance improvement, we design a multi-level HAR modeling pipeline called Stage-Logits-Memory Distillation (SMLDist) based on the widely used MobileNet. By paying more attention to frequency-related features during the distillation process, SMLDist improves the robustness of the student HAR classifiers. We also propose an automatic search mechanism over heterogeneous classifiers to further improve classification performance. Extensive simulation results demonstrate that SMLDist outperforms various state-of-the-art HAR frameworks in accuracy and macro F1 score. A practical evaluation on the Jetson Xavier AGX platform shows that the SMLDist model is both energy-efficient and computation-efficient. These experiments validate the reasonable balance between robustness and efficiency achieved by the proposed model. Comparative knowledge distillation experiments on six public datasets further demonstrate that SMLDist outperforms other advanced knowledge distillation methods in terms of student performance, which verifies the good generalization of SMLDist to other classification tasks, including but not limited to HAR.
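For readers unfamiliar with multi-level distillation, the following minimal PyTorch sketch illustrates the general idea of combining hard-label supervision, softened-logits distillation, and stage-wise feature matching between a teacher and a student. The loss weights, temperature, and function names are illustrative assumptions for exposition, not the exact SMLDist formulation.

```python
import torch
import torch.nn.functional as F

def multi_level_distill_loss(student_logits, teacher_logits,
                             student_feats, teacher_feats,
                             labels, T=4.0, alpha=0.5, beta=0.5):
    """Illustrative multi-level distillation objective (hypothetical
    weights alpha/beta and temperature T, not the paper's values)."""
    # Supervised loss on ground-truth activity labels.
    ce = F.cross_entropy(student_logits, labels)

    # Logits-level distillation with temperature-softened distributions.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    # Stage-level distillation: match intermediate feature maps
    # (assumes each student stage is already projected to the
    # corresponding teacher stage's shape).
    stage = sum(F.mse_loss(s, t) for s, t in zip(student_feats, teacher_feats))

    return ce + alpha * kd + beta * stage
```

In practice, the teacher runs in evaluation mode with gradients disabled, and only the student parameters are updated with this combined loss.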