Learning discriminative face features plays a major role in building high-performing face recognition models. Recent state-of-the-art face recognition solutions propose incorporating a fixed penalty margin into the commonly used classification loss function, the softmax loss, computed on the normalized hypersphere, to increase the discriminative power of face recognition models by minimizing the intra-class variation and maximizing the inter-class variation. Margin-penalty softmax losses, such as ArcFace and CosFace, assume that the geodesic distances between and within the different identities can be equally learned using a single fixed penalty margin. However, such a learning objective is not realistic for real data with inconsistent inter- and intra-class variation, which might limit the discriminability and generalizability of the face recognition model. In this paper, we relax the fixed penalty margin constraint by proposing an elastic penalty margin loss (ElasticFace) that allows flexibility in the push for class separability. The main idea is to use random margin values drawn from a normal distribution in each training iteration. This aims at giving the decision boundary chances to extract and retract, allowing space for flexible class separability learning. We demonstrate the superiority of our ElasticFace loss over the ArcFace and CosFace losses, using the same geometric transformation, on a large set of mainstream benchmarks. From a wider perspective, our ElasticFace has advanced the state-of-the-art face recognition performance on seven out of nine mainstream benchmarks.
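To make the elastic-margin idea concrete, the sketch below shows one plausible PyTorch realization in the ArcFace-style angular setting: embeddings and class centers are L2-normalized, and an additive angular margin is re-drawn per sample from a normal distribution at every forward pass instead of being fixed. All names and hyper-parameter values (embedding size, scale s=64, margin mean m=0.5, standard deviation sigma=0.05) are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of an elastic angular-margin softmax head (ElasticFace-Arc style).
# Hyper-parameters and class/variable names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ElasticArcMarginHead(nn.Module):
    """Softmax head with an additive angular margin sampled from N(m, sigma) per sample."""

    def __init__(self, embed_dim: int, num_classes: int,
                 s: float = 64.0, m: float = 0.50, sigma: float = 0.05):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, embed_dim))
        nn.init.xavier_normal_(self.weight)
        self.s, self.m, self.sigma = s, m, sigma

    def forward(self, embeddings: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between L2-normalized embeddings and class centers
        # (the "normalized hypersphere" setting used by ArcFace/CosFace).
        cos = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cos.clamp(-1.0 + 1e-7, 1.0 - 1e-7))

        # Elastic margin: one value per sample, re-drawn in every training
        # iteration from a normal distribution instead of using a fixed margin.
        margins = self.m + self.sigma * torch.randn(labels.size(0), device=cos.device)

        # Add the sampled margin only to the angle of the ground-truth class,
        # then rescale and apply the standard cross-entropy (softmax) loss.
        one_hot = F.one_hot(labels, num_classes=cos.size(1)).float()
        logits = torch.cos(theta + one_hot * margins.unsqueeze(1))
        return F.cross_entropy(self.s * logits, labels)


# Shape-level usage example (random data, not a training recipe):
head = ElasticArcMarginHead(embed_dim=512, num_classes=1000)
loss = head(torch.randn(8, 512), torch.randint(0, 1000, (8,)))
```

Because a fresh margin is sampled for every sample in every iteration, the target-class decision boundary is sometimes pushed harder and sometimes relaxed, which is the mechanism the abstract refers to as letting the boundary extract and retract; a cosine-margin (CosFace-style) variant would subtract the sampled margin from the target cosine instead of adding it to the angle.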