Owing to their outstanding capability for data generation, Generative Adversarial Networks (GANs) have attracted considerable attention in unsupervised learning. However, training GANs is difficult, since the distribution seen by the discriminator shifts during training, leading to unstable image representations. In this paper, we address the problem of training GANs from a novel perspective, \emph{i.e.,} robust image classification. Motivated by studies on robust image representation, we propose a simple yet effective module, namely AdaptiveMix, for GANs, which shrinks the region occupied by training data in the image representation space of the discriminator. Since directly bounding the feature space is intractable, we propose to construct hard samples and narrow the feature distance between hard and easy samples. Hard samples are constructed by mixing pairs of training images. We evaluate the effectiveness of our AdaptiveMix with widely used and state-of-the-art GAN architectures. The evaluation results demonstrate that AdaptiveMix facilitates the training of GANs and effectively improves the image quality of generated samples. We also show that AdaptiveMix can be further applied to image classification and Out-Of-Distribution (OOD) detection tasks by combining it with state-of-the-art methods. Extensive experiments on seven publicly available datasets show that our method effectively boosts the performance of baselines. The code is publicly available at https://github.com/WentianZhang-ML/AdaptiveMix.
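For concreteness, the following is a minimal PyTorch sketch of the core idea stated above: construct hard samples by mixing pairs of training images, then penalize the distance between the feature of the mixed image and the corresponding mix of the features of the original (easy) images. The helper names (\texttt{feature\_extractor}, \texttt{adaptive\_mix\_loss}) and the choice of an MSE distance are illustrative assumptions, not the authors' exact implementation.

\begin{verbatim}
# Minimal sketch of the AdaptiveMix idea (assumptions noted in the lead-in).
import torch
import torch.nn.functional as F
from torch.distributions import Beta

def adaptive_mix_loss(feature_extractor, x, alpha=1.0):
    """Shrink the discriminator's feature space around the training data."""
    # Sample a mixing coefficient lambda ~ Beta(alpha, alpha), as in Mixup.
    lam = Beta(alpha, alpha).sample().to(x.device)
    # Pair each image with a randomly permuted partner from the batch.
    perm = torch.randperm(x.size(0), device=x.device)
    x_mixed = lam * x + (1.0 - lam) * x[perm]      # hard samples
    feat_easy = feature_extractor(x)               # features of easy samples
    feat_mixed = feature_extractor(x_mixed)        # features of hard samples
    # Target: the same convex combination, taken in feature space.
    feat_target = lam * feat_easy + (1.0 - lam) * feat_easy[perm]
    # Narrow the gap between hard-sample features and mixed easy features.
    return F.mse_loss(feat_mixed, feat_target)
\end{verbatim}

In practice, a loss of this form would be added to the discriminator's objective so that gradients shrink the feature-space region spanned by the training data.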