Generative adversarial networks (GANs) have shown impressive results in both unconditional and conditional image generation. Recent literature shows that a GAN pre-trained on a different dataset can be transferred to improve image generation from small target data. The same, however, has not been well studied for conditional GANs (cGANs), which offer new opportunities for knowledge transfer compared to the unconditional setup. In particular, the new classes may borrow knowledge from related old classes, or share knowledge among themselves to improve training. This motivates us to study the problem of efficient conditional GAN transfer with knowledge propagation across classes. To address this problem, we introduce a new GAN transfer method that explicitly propagates knowledge from the old classes to the new classes. The key idea is to enforce the popularly used conditional batch normalization (BN) to learn the class-specific information of the new classes from that of the old classes, with implicit knowledge sharing among the new ones. This allows for efficient knowledge propagation from the old classes to the new classes, with the number of BN parameters increasing only linearly with the number of new classes. Extensive evaluation demonstrates the clear superiority of the proposed method over state-of-the-art competitors on efficient conditional GAN transfer tasks. The code will be available at: https://github.com/mshahbazi72/cGANTransfer
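The core mechanism can be illustrated with a minimal sketch. The snippet below is a hypothetical NumPy illustration (not the paper's implementation): conditional BN keeps frozen per-class scale/shift parameters for the old classes, and each new class learns only a small weight vector whose softmax combines the old-class parameters. All names (`CondBNTransfer`, `params_for`) are made up for illustration; the point is that trainable parameters grow linearly with the number of new classes, independent of feature width.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class CondBNTransfer:
    """Hypothetical sketch of conditional BN with knowledge propagation:
    new-class scale/shift parameters are learned combinations of the
    (frozen) old-class parameters."""

    def __init__(self, n_old, n_new, n_feat, rng=None):
        rng = np.random.default_rng(0) if rng is None else rng
        # Frozen per-class BN parameters learned on the old classes.
        self.gamma_old = rng.normal(1.0, 0.1, (n_old, n_feat))
        self.beta_old = rng.normal(0.0, 0.1, (n_old, n_feat))
        # Only these combination weights are trained for the new classes:
        # n_new * n_old scalars, linear in the number of new classes.
        self.w = np.zeros((n_new, n_old))
        self.n_old = n_old

    def params_for(self, cls):
        if cls < self.n_old:
            # Old class: use its own frozen parameters directly.
            return self.gamma_old[cls], self.beta_old[cls]
        # New class: softmax-weighted combination over old-class parameters.
        a = softmax(self.w[cls - self.n_old])
        return a @ self.gamma_old, a @ self.beta_old

    def forward(self, x, cls, eps=1e-5):
        # Normalize over the batch, then apply class-specific scale/shift.
        xn = (x - x.mean(0)) / np.sqrt(x.var(0) + eps)
        g, b = self.params_for(cls)
        return g * xn + b
```

In a real cGAN, the combination weights would be trained jointly with the adversarial loss on the target data while the old-class BN parameters stay fixed, which is what lets the new classes borrow class-specific statistics from related old classes.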