Unpaired image-to-image (I2I) translation has received considerable attention in pattern recognition and computer vision because of recent advancements in generative adversarial networks (GANs). However, due to the lack of explicit supervision, unpaired I2I models often fail to generate realistic images, especially on challenging datasets with varying backgrounds and poses. Hence, stabilization is indispensable for GAN training and for applications of I2I translation. Herein, we propose Augmented Cyclic Consistency Regularization (ACCR), a novel regularization method for unpaired I2I translation. Our main idea is to enforce consistency regularization, originating from semi-supervised learning, on the discriminators, leveraging real, fake, reconstructed, and augmented samples. We regularize the discriminators to output similar predictions when fed pairs of original and perturbed images. We qualitatively clarify why consistency regularization on fake and reconstructed samples works well. Quantitatively, our method outperforms the consistency regularized GAN (CR-GAN) on real-world translations and demonstrates efficacy against several data augmentation variants and cycle-consistent constraints.
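To make the regularization idea concrete, the following is a minimal sketch, not the authors' implementation: it assumes a PyTorch-style discriminator `D`, an augmentation callable `augment`, and hypothetical helper names (`consistency_loss`, `accr_loss`) and weights. It penalizes the discriminator when its prediction changes between an original image and its perturbed copy, and, following the abstract, applies this penalty to real, fake (translated), and reconstructed samples.

```python
import torch
import torch.nn.functional as F

def consistency_loss(D, x, augment):
    """CR-GAN-style regularizer: the discriminator should give
    similar outputs for an image and its augmented counterpart."""
    d_orig = D(x)            # prediction on the original sample
    d_aug = D(augment(x))    # prediction on the perturbed sample
    return F.mse_loss(d_aug, d_orig)

def accr_loss(D, real, fake, recon, augment, weights=(1.0, 1.0, 1.0)):
    """Sketch of the ACCR idea: consistency regularization on real,
    fake (translated), and cycle-reconstructed samples. The weights
    and the use of an MSE penalty are assumptions for illustration."""
    w_real, w_fake, w_rec = weights
    # fake/reconstructed images are detached: this term updates D only
    return (w_real * consistency_loss(D, real, augment)
            + w_fake * consistency_loss(D, fake.detach(), augment)
            + w_rec * consistency_loss(D, recon.detach(), augment))
```

In a cycle-consistent setup such as CycleGAN, this term would simply be added to each discriminator's adversarial loss, with `fake` produced by the opposite-domain generator and `recon` produced by the round-trip mapping.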