Recent works on Generative Adversarial Networks (GANs) have actively revisited various data augmentation techniques as an effective way to prevent discriminator overfitting. It is still unclear, however, which augmentations actually improve GANs, and in particular, how to apply a wider range of augmentations during training. In this paper, we propose a novel way to address these questions by incorporating a recent contrastive representation learning scheme into the GAN discriminator, coined ContraD. This "fusion" enables the discriminator to work with much stronger augmentations without increasing its training instability, thereby mitigating the discriminator overfitting issue in GANs more effectively. Even better, we observe that contrastive learning itself also benefits from our GAN training, i.e., by maintaining discriminative features between real and fake samples, suggesting a strong coherence between the two worlds: good contrastive representations are also good for GAN discriminators, and vice versa. Our experimental results show that GANs with ContraD consistently improve FID and IS over other recent techniques that incorporate data augmentations, while still maintaining highly discriminative features in the discriminator in terms of linear evaluation. Finally, as a byproduct, we also show that our GANs trained in an unsupervised manner (without labels) can induce many conditional generative models via simple latent sampling, leveraging the learned features of ContraD. Code is available at https://github.com/jh-jeong/ContraD.
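To make the "fusion" concrete: the contrastive scheme referenced here is of the SimCLR family, whose core objective is the NT-Xent loss computed between two augmented views of the same batch. The sketch below is a minimal numpy illustration of that loss alone; it is not the paper's implementation, and in ContraD this term is combined with the standard GAN objective through separate projection heads on a shared discriminator backbone. All function and variable names are illustrative.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent (SimCLR-style) contrastive loss between two augmented views.

    z1, z2: (N, d) feature batches, where row i of z1 and row i of z2
    come from two augmentations of the same sample (the positive pair).
    """
    z = np.concatenate([z1, z2], axis=0)                 # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)     # unit-normalize
    sim = z @ z.T / temperature                          # scaled cosine sims
    np.fill_diagonal(sim, -np.inf)                       # mask self-similarity
    n = z1.shape[0]
    # For row i, the positive sits at i+n (and vice versa).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))          # normalizer per row
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)     # cross-entropy form
    return loss.mean()

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 16))
# Nearly identical views should give a much lower loss than unrelated ones.
aligned_loss = nt_xent_loss(feats, feats + 0.01 * rng.normal(size=(8, 16)))
random_loss = nt_xent_loss(feats, rng.normal(size=(8, 16)))
```

Training the discriminator's shared backbone with this loss on strongly augmented views (plus a small discriminator head trained with the GAN loss) is the high-level recipe the abstract describes; the exact head architectures and loss weighting are specified in the paper.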