Human motion characteristics are used to monitor the progression of neurological diseases and mood disorders. Since perceptions of emotion are also interleaved with body posture and movement, emotion recognition from human gait can be used to quantitatively monitor mood changes. Existing solutions often use shallow machine learning models on raw positional data or manually extracted features. However, gait comprises many highly expressive characteristics that can identify a human subject, and most solutions fail to address this, disregarding the subject's privacy. This work introduces a novel deep neural network architecture to disentangle human emotions from biometrics. In particular, we propose a cross-subject transfer learning technique for training a multi-encoder autoencoder deep neural network to learn disentangled latent representations of human motion features. By disentangling subject biometrics from the gait data, we show that the subject's privacy is preserved while affect recognition performance outperforms traditional methods. Furthermore, we exploit Guided Grad-CAM to provide global explanations of the model's decisions across gait cycles. We evaluate the effectiveness of our method against existing methods at recognizing emotions, using both 3D temporal joint signals and manually extracted features, and we show that this data can easily be exploited to expose a subject's identity. Our method achieves up to a 7% improvement and highlights the joints with the most significant influence across the average gait cycle.
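To make the multi-encoder autoencoder idea concrete, the following is a minimal forward-pass sketch, not the paper's implementation: two parallel encoders map the same gait input into separate affect and biometric latent codes, and a shared decoder reconstructs the input from their concatenation. All layer sizes, weight initializations, and variable names here are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a two-branch (multi-encoder) autoencoder forward pass.
# Assumption: gait input is a flattened 3D joint trajectory; sizes are illustrative.
rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    # Illustrative random weight matrix for a single linear layer.
    return rng.standard_normal((in_dim, out_dim)) * 0.1

D_in, D_aff, D_bio = 96, 8, 8          # input dim, affect latent, biometric latent
W_aff = linear(D_in, D_aff)            # affect (emotion) encoder weights
W_bio = linear(D_in, D_bio)            # subject-biometric encoder weights
W_dec = linear(D_aff + D_bio, D_in)    # shared decoder weights

x = rng.standard_normal((4, D_in))     # batch of 4 gait samples
z_aff = np.tanh(x @ W_aff)             # affect latent code
z_bio = np.tanh(x @ W_bio)             # biometric latent code
x_hat = np.concatenate([z_aff, z_bio], axis=1) @ W_dec  # reconstruction

print(z_aff.shape, z_bio.shape, x_hat.shape)  # (4, 8) (4, 8) (4, 96)
```

In training, a reconstruction loss on `x_hat` would be combined with objectives that push subject identity into `z_bio` and emotion into `z_aff`, so that `z_bio` can be withheld to preserve privacy while `z_aff` feeds the affect classifier.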