Generative adversarial networks (GANs) have proven successful in image generation tasks. However, GAN training is inherently unstable. Although many works try to stabilize it by manually modifying the GAN architecture, doing so requires much expertise. Neural architecture search (NAS) has become an attractive solution for searching GANs automatically. Early NAS-GANs search only the generator to reduce search complexity, but this leads to sub-optimal GANs. Some recent works try to search both the generator (G) and the discriminator (D), but they suffer from the instability of GAN training. To alleviate this instability, we propose an efficient two-stage evolutionary algorithm-based NAS framework for searching GANs, namely EAGAN. We decouple the search of G and D into two stages: stage-1 searches G with a fixed D and adopts a many-to-one training strategy, while stage-2 searches D with the optimal G found in stage-1 and adopts one-to-one training and weight-resetting strategies to enhance the stability of GAN training. Both stages use the non-dominated sorting method to produce Pareto-front architectures under multiple objectives (e.g., model size, Inception Score (IS), and Fr\'echet Inception Distance (FID)). EAGAN is applied to the unconditional image generation task and can efficiently finish the search on the CIFAR-10 dataset in 1.2 GPU days. Our searched GANs achieve competitive results (IS=8.81$\pm$0.10, FID=9.91) on the CIFAR-10 dataset and surpass prior NAS-GANs on the STL-10 dataset (IS=10.44$\pm$0.087, FID=22.18). Source code: https://github.com/marsggbo/EAGAN.
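The non-dominated sorting step mentioned above can be illustrated with a minimal sketch: candidate architectures are compared under multiple objectives, and only those not Pareto-dominated by any other candidate survive. The objective values below are hypothetical, not taken from the paper; model size and FID are minimized, and IS is negated so every objective is a minimization.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b:
    no worse on every objective and strictly better on at least one
    (all objectives are treated as minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of a list of objective vectors."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Hypothetical candidates as (params_in_M, -IS, FID).
cands = [(5.0, -8.5, 12.0),   # small, good IS, good FID
         (7.0, -8.8, 10.0),   # larger but best IS and FID
         (6.0, -8.4, 13.0)]   # dominated by the first candidate
front = pareto_front(cands)
```

In EAGAN this ranking is applied in both search stages; a full evolutionary loop would additionally handle crowding distance and selection, which are omitted here for brevity.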