This paper presents a novel convolutional layer, called perturbed convolution (PConv), which pursues two goals simultaneously: improving generative adversarial network (GAN) performance and alleviating the memorization problem, in which the discriminator memorizes all images from a given dataset as training progresses. In PConv, perturbed features are generated by randomly disturbing the input tensor before performing the convolution operation. This approach is simple but surprisingly effective. First, to produce a similar output even from a perturbed tensor, each layer in the discriminator must learn robust features with a small local Lipschitz constant. Second, since the input tensor is randomly perturbed during training, much like dropout in neural networks, the memorization problem can be alleviated. To demonstrate the generalization ability of the proposed method, we conducted extensive experiments with various loss functions and datasets, including CIFAR-10, CelebA, CelebA-HQ, LSUN, and tiny-ImageNet. The quantitative evaluations demonstrate that PConv effectively boosts the performance of GANs and conditional GANs in terms of Fréchet inception distance (FID).
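For intuition, the following is a minimal PyTorch sketch of the idea described above: a drop-in convolution that randomly disturbs its input before convolving, and only during training. The abstract does not specify the exact perturbation scheme, so additive Gaussian noise and the `noise_std` parameter are assumptions for illustration, not the paper's actual method.

```python
import torch
import torch.nn as nn


class PConv(nn.Module):
    """Sketch of a perturbed convolution: disturb the input tensor at
    random before the convolution. The perturbation form (additive
    Gaussian noise here) is an assumption; the paper may use another."""

    def __init__(self, in_channels, out_channels, kernel_size,
                 stride=1, padding=0, noise_std=0.1):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              stride=stride, padding=padding)
        self.noise_std = noise_std  # hypothetical perturbation strength

    def forward(self, x):
        # Perturb only in training mode, analogous to dropout.
        if self.training:
            x = x + self.noise_std * torch.randn_like(x)
        return self.conv(x)


# Usage: a drop-in replacement for nn.Conv2d in a GAN discriminator.
layer = PConv(3, 64, kernel_size=3, padding=1)
out = layer(torch.randn(8, 3, 32, 32))  # -> shape (8, 64, 32, 32)
```

Because the perturbation is applied only when `self.training` is true, evaluation remains deterministic; the training-time noise encourages each layer to map nearby inputs to nearby outputs, which is the small-local-Lipschitz behavior the abstract describes.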