Learning discriminative face features plays a major role in building high-performing face recognition models. Recent state-of-the-art face recognition solutions propose incorporating a fixed penalty margin into the commonly used classification loss function, softmax loss, on the normalized hypersphere to increase the discriminative power of face recognition models by minimizing intra-class variation and maximizing inter-class variation. Marginal softmax losses, such as ArcFace and CosFace, assume that the geodesic distance between and within the different identities can be equally learned using a fixed margin. However, such a learning objective is not realistic for real data with inconsistent inter- and intra-class variation, which might limit the discriminability and generalizability of the face recognition model. In this paper, we relax the fixed margin constraint by proposing an elastic margin loss (ElasticFace) that allows flexibility in the push for class separability. The main idea is to utilize random margin values drawn from a normal distribution in each training iteration. This aims at giving the margin the chance to expand and retract, allowing space for flexible class separability learning. We demonstrate the superiority of our elastic margin loss over the ArcFace and CosFace losses, using the same geometric transformation, on a large set of mainstream benchmarks. From a wider perspective, our ElasticFace has advanced the state-of-the-art face recognition performance on six out of nine mainstream benchmarks.
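The core idea above can be sketched in a few lines. The following is a minimal NumPy sketch of an ElasticFace-Arc style margin (an additive angular margin in the spirit of ArcFace, but sampled per training sample from a normal distribution); the function name, hyper-parameter values `s`, `m`, and `sigma`, and the exact placement of the margin are illustrative assumptions, not the authors' reference implementation:

```python
import numpy as np

def elastic_arc_margin_logits(cos_theta, labels, s=64.0, m=0.5, sigma=0.05, rng=None):
    """Sketch of an elastic (randomized) additive angular margin.

    cos_theta: (batch, num_classes) cosine similarities between L2-normalized
               embeddings and L2-normalized class weights.
    labels:    (batch,) ground-truth class indices.
    s, m, sigma: scale and margin-distribution parameters (illustrative values).
    """
    rng = np.random.default_rng() if rng is None else rng
    batch = cos_theta.shape[0]
    # Draw a fresh margin per sample from N(m, sigma^2) in every iteration,
    # instead of the single fixed margin used by ArcFace.
    margins = rng.normal(loc=m, scale=sigma, size=batch)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    logits = cos_theta.copy()
    rows = np.arange(batch)
    # Apply the additive angular margin only to each sample's target-class logit.
    logits[rows, labels] = np.cos(theta[rows, labels] + margins)
    # Scaled logits are then fed into standard softmax cross-entropy.
    return s * logits
```

Setting `sigma=0` recovers a fixed-margin (ArcFace-like) penalty, which makes the "elastic" part easy to ablate; a cosine-margin (CosFace-like) variant would instead subtract the sampled margin from the target cosine directly.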