Skeleton-based action recognition attracts practitioners and researchers due to the lightweight, compact nature of its datasets. Compared with RGB-video-based action recognition, it better protects the privacy of subjects while achieving competitive recognition performance. However, as skeleton estimation algorithms and motion and depth sensors improve, skeleton datasets preserve finer motion characteristics, creating the potential for privacy leakage. To investigate this potential leakage, we first train classifiers to infer sensitive private information from the trajectories of joints. Our preliminary experiments show that the gender classifier achieves 87% accuracy on average and the re-identification task achieves 80% accuracy on average across three baseline models: Shift-GCN, MS-G3D, and 2s-AGCN. We then propose an adversarial anonymization algorithm to protect against privacy leakage from the skeleton dataset. Experimental results show that the anonymized dataset reduces the risk of privacy leakage while having only a marginal effect on action recognition performance.
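The adversarial anonymization idea can be illustrated with a minimal sketch, not the paper's actual method: here the skeleton clip is a flattened feature vector, the privacy classifier is a hypothetical pre-trained logistic regression, and anonymization is gradient ascent on the privacy loss under an L2 perturbation budget so the motion (and hence action recognition) is barely changed. All names and values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: a skeleton clip flattened to a feature vector,
# and weights of a pre-trained linear "privacy" classifier (e.g., gender).
x = rng.normal(size=20)   # skeleton features
w = rng.normal(size=20)   # privacy classifier weights
y = 1.0                   # true private label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def privacy_loss(v):
    # Cross-entropy of the privacy classifier on the true private label.
    p = sigmoid(w @ v)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

# Anonymization sketch: ascend the privacy loss (confuse the classifier)
# while projecting back into a small L2 ball around the original clip.
x_anon = x.copy()
eps, steps, budget = 0.05, 50, 1.0
for _ in range(steps):
    p = sigmoid(w @ x_anon)
    grad = (p - y) * w            # d(loss)/dx for logistic regression
    x_anon += eps * grad          # step uphill on the privacy loss
    delta = x_anon - x
    norm = np.linalg.norm(delta)
    if norm > budget:             # keep the perturbation small
        x_anon = x + delta * (budget / norm)
```

After the loop, the privacy classifier's loss on `x_anon` is higher than on `x`, while `x_anon` stays within the perturbation budget; the paper's algorithm additionally balances this against preserving action recognition performance.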