Generative Adversarial Networks (GANs) have shown compelling results in various tasks and applications in recent years. However, mode collapse remains a critical problem in GANs. In this paper, we propose a novel training pipeline to address the mode collapse issue of GANs. Different from existing methods, we generalize the discriminator as a feature embedding and maximize the entropy of the distribution in the embedding space learned by the discriminator. Specifically, two regularization terms, i.e., Deep Local Linear Embedding (DLLE) and Deep Isometric feature Mapping (DIsoMap), are designed to encourage the discriminator to learn the structural information embedded in the data, so that the embedding space learned by the discriminator is well-structured. Based on this well-learned embedding space, a non-parametric entropy estimator is designed to efficiently maximize the entropy of the embedding vectors, serving as an approximation of maximizing the entropy of the generated distribution. By improving the discriminator and maximizing the distance between the most similar samples in the embedding space, our pipeline effectively reduces mode collapse without sacrificing the quality of generated samples. Extensive experimental results show the effectiveness of our method, which outperforms the GAN baseline MaF-GAN on CelebA (9.13 vs. 12.43 in FID) and surpasses the recent state-of-the-art energy-based model on the ANIME-FACE dataset (2.80 vs. 2.26 in Inception score). The code is available at https://github.com/HaozheLiu-ST/MEE
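The core mechanism, maximizing the distance between the most similar samples in the embedding space, can be sketched with a simple nearest-neighbour entropy surrogate. This is a hypothetical minimal illustration (Kozachenko–Leonenko style), not the authors' exact estimator; the function name and the batch setup are assumptions for demonstration.

```python
import numpy as np

def knn_entropy_surrogate(embeddings: np.ndarray) -> float:
    """Non-parametric entropy surrogate: mean log-distance from each
    sample to its nearest neighbour in embedding space. Maximizing it
    pushes the most similar samples apart (illustrative sketch, not
    the paper's exact estimator)."""
    # pairwise squared Euclidean distances, shape (n, n)
    diff = embeddings[:, None, :] - embeddings[None, :, :]
    d2 = (diff ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)          # exclude self-distance
    nn_dist = np.sqrt(d2.min(axis=1))     # nearest-neighbour distance per sample
    return float(np.mean(np.log(nn_dist + 1e-12)))

# A tightly clustered batch (low diversity) scores lower than the same
# batch spread out by a factor of 10 (higher diversity).
rng = np.random.default_rng(0)
tight = rng.normal(scale=0.1, size=(64, 8))
spread = tight * 10.0
```

In a GAN training loop, the negative of this quantity would be added to the generator loss so that gradient descent increases embedding-space diversity, discouraging the generator from collapsing onto a few modes.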