In this work, we investigate the problem of reconstructing a face image from a facial feature representation extracted by a blackbox face recognition engine. This is a very challenging problem in practice because the engine exposes only abstracted information about the input face. We therefore introduce a new method, named Attention-based Bijective Generative Adversarial Networks in a Distillation framework (DAB-GAN), to synthesize faces of a subject given his/her extracted face recognition features. Given any unconstrained, unseen facial features of a subject, DAB-GAN can reconstruct his/her faces in high definition. The DAB-GAN method comprises a novel attention-based generative structure together with the newly defined Bijective Metric Learning approach. The framework first introduces a bijective metric so that distance measurement and metric learning can be performed directly in the image domain for the image reconstruction task. The information from the blackbox face recognition engine is then optimally exploited through a global distillation process. Finally, an attention-based generator is presented to robustly synthesize realistic faces with ID preservation. We have evaluated our method on challenging face recognition databases, i.e., CelebA, LFW, AgeDB, and CFP-FP, and consistently achieved state-of-the-art results. The advantages of DAB-GAN are also demonstrated in terms of both image realism and ID preservation.
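The sketch below illustrates the general idea behind a bijective metric, under assumptions of ours rather than the authors' exact formulation: an invertible (bijective) mapping f loses no information about its input, so a distance computed on f(x) remains a valid metric in the image domain. The toy AffineCoupling block, the bijective_metric_loss helper, and all tensor shapes are hypothetical and chosen only for illustration.

```python
# Minimal sketch (assumed, not the paper's exact design) of a bijective metric:
# an invertible coupling layer f maps images to features without losing
# information, so distances on f(x) correspond to distances in the image domain.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Toy invertible block: one half of the channels is transformed by an
    affine map conditioned on the other half (hypothetical architecture)."""
    def __init__(self, channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels // 2, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, channels, 3, padding=1),  # predicts log-scale and shift
        )

    def forward(self, x):
        xa, xb = x.chunk(2, dim=1)
        log_s, t = self.net(xa).chunk(2, dim=1)
        yb = xb * torch.exp(log_s) + t          # invertible affine transform
        return torch.cat([xa, yb], dim=1)

    def inverse(self, y):
        ya, yb = y.chunk(2, dim=1)
        log_s, t = self.net(ya).chunk(2, dim=1)
        xb = (yb - t) * torch.exp(-log_s)
        return torch.cat([ya, xb], dim=1)

def bijective_metric_loss(f, x_real, x_fake):
    """Distance measured through the bijective map f; because f is invertible,
    a small value here also implies closeness in the image domain."""
    return (f(x_real) - f(x_fake)).flatten(1).norm(dim=1).mean()

# Usage sketch with hypothetical tensor sizes:
f = AffineCoupling(channels=4)
x_real = torch.randn(2, 4, 32, 32)
x_fake = torch.randn(2, 4, 32, 32)
loss = bijective_metric_loss(f, x_real, x_fake)
```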