Generative adversarial networks (GANs) have enjoyed tremendous empirical success, and research interest in the theoretical understanding of the GAN training process is growing rapidly, especially regarding its evolution and convergence analysis. This paper establishes approximations, with precise error-bound analysis, for the training of GANs under stochastic gradient algorithms (SGAs). The approximations take the form of coupled stochastic differential equations (SDEs). Analysis of the SDEs and the associated invariant measures yields conditions for the convergence of GAN training. Further analysis of the invariant measure of the coupled SDEs gives rise to fluctuation-dissipation relations (FDRs) for GANs, revealing the trade-off between the generator and the discriminator in the loss landscape and providing guidance for learning-rate scheduling.