Human motion is a biomarker used extensively in clinical analysis to monitor the progression of neurological diseases and mood disorders. Since perceptions of emotion are interleaved with body posture and movement, emotion recognition from human gait can be used to quantitatively monitor mood changes that are often related to neurological disorders. Many existing solutions use shallow machine learning models on raw positional data or manually extracted features to achieve this. However, gait comprises many highly expressive characteristics that can be used to identify human subjects, and most solutions fail to address this, disregarding the subject's privacy. This work evaluates the effectiveness of existing methods at recognising emotions from both 3D temporal joint signals and manually extracted features. We also show that this data can easily be exploited to expose a subject's identity. To this end, we propose a cross-subject transfer learning technique for training a multi-encoder autoencoder deep neural network to learn disentangled latent representations of human motion features. By disentangling subject biometrics from the gait data, we show that the subject's privacy is preserved while affect recognition performance outperforms traditional methods.
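The multi-encoder autoencoder described above can be sketched in shape terms: two encoders map the same gait sequence into separate latent spaces (one intended for affect, one for subject identity), and a single decoder reconstructs the motion from their concatenation. This is a minimal numpy illustration only, not the authors' implementation; all dimensions (25 joints, 48 frames, 16-dimensional latents) and the plain linear layers are hypothetical, and the training objectives that actually enforce disentanglement are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    # Random weights stand in for trained parameters (illustration only).
    return rng.standard_normal((in_dim, out_dim)) * 0.1

# Hypothetical dimensions: 25 joints x 3 coordinates over 48 frames, flattened.
input_dim, affect_dim, id_dim = 25 * 3 * 48, 16, 16

# Two encoders over the same input, one latent space per factor.
W_affect = linear(input_dim, affect_dim)  # affect encoder
W_id = linear(input_dim, id_dim)          # identity (biometrics) encoder
# One decoder reconstructs the motion from the concatenated latents.
W_dec = linear(affect_dim + id_dim, input_dim)

def forward(x):
    z_affect = np.tanh(x @ W_affect)  # affect latent
    z_id = np.tanh(x @ W_id)          # identity latent
    x_hat = np.concatenate([z_affect, z_id], axis=-1) @ W_dec
    return z_affect, z_id, x_hat

x = rng.standard_normal((4, input_dim))  # batch of 4 gait sequences
z_a, z_i, x_hat = forward(x)
print(z_a.shape, z_i.shape, x_hat.shape)
```

In a privacy-preserving setup of this kind, only `z_affect` would be passed to the downstream emotion classifier, so the identity information isolated in `z_id` never leaves the model.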