Face recognition, as one of the most successful applications of artificial intelligence, has been widely used in security, administration, advertising, and healthcare. However, the privacy issues of public face datasets have attracted increasing attention in recent years. Previous works either simply mask most areas of faces or synthesize samples with generative models to build privacy-preserving face datasets, overlooking the trade-off between privacy protection and data utility. In this paper, we propose FaceMAE, a novel framework in which face privacy and recognition performance are considered simultaneously. First, randomly masked face images are used to train the reconstruction module in FaceMAE. We tailor an instance relation matching (IRM) module to minimize the distribution gap between real faces and those reconstructed by FaceMAE. During the deployment phase, the trained FaceMAE reconstructs images from masked faces of unseen identities without extra training. The risk of privacy leakage is measured by face retrieval between the reconstructed and original datasets. Experiments show that the identities of the reconstructed images are difficult to retrieve. We also perform extensive privacy-preserving face recognition experiments on several public face datasets (i.e., CASIA-WebFace and WebFace260M). Compared to previous state-of-the-art methods, FaceMAE consistently \textbf{reduces the error rate by at least 50\%} on LFW, CFP-FP, and AgeDB.
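To make the two core ideas concrete, below is a minimal PyTorch sketch. The masking function follows standard MAE-style random patch masking as the abstract describes; the IRM loss is our hypothetical rendering as pairwise cosine-similarity matching between real and reconstructed embeddings, and the function names (\texttt{random\_mask}, \texttt{instance\_relation\_matching}) are illustrative, not the paper's actual implementation.

\begin{verbatim}
import torch
import torch.nn.functional as F

def random_mask(patches: torch.Tensor, mask_ratio: float = 0.75):
    """MAE-style random masking: drop a fraction of patch tokens.

    patches: (B, N, D) patch embeddings. Returns the kept tokens and
    the random permutation indices (needed later to restore order).
    """
    B, N, D = patches.shape
    n_keep = int(N * (1.0 - mask_ratio))
    noise = torch.rand(B, N, device=patches.device)  # per-patch scores
    ids_shuffle = noise.argsort(dim=1)               # random permutation
    ids_keep = ids_shuffle[:, :n_keep]
    kept = torch.gather(
        patches, 1, ids_keep.unsqueeze(-1).expand(-1, -1, D)
    )
    return kept, ids_shuffle

def instance_relation_matching(real_feat: torch.Tensor,
                               recon_feat: torch.Tensor) -> torch.Tensor:
    """Hypothetical IRM loss: align the pairwise-similarity structure
    of real and reconstructed face embeddings, both shaped (B, D)."""
    real = F.normalize(real_feat, dim=-1)
    recon = F.normalize(recon_feat, dim=-1)
    rel_real = real @ real.t()    # (B, B) instance-relation matrix
    rel_recon = recon @ recon.t()
    return F.mse_loss(rel_recon, rel_real)
\end{verbatim}

In this reading, matching relation matrices rather than individual features encourages the reconstructed dataset to preserve inter-instance structure useful for recognition, while the heavy masking withholds most identity-revealing pixels.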