Generative Adversarial Networks (GANs) are powerful machine learning models capable of generating fully synthetic samples of a desired phenomenon at high resolution. Despite their success, the training process of a GAN is highly unstable, and it is typically necessary to equip the networks with several auxiliary heuristics to reach acceptable convergence of the model. In this paper, we introduce a novel method to analyze the convergence and stability of the training of Generative Adversarial Networks. For this purpose, we propose to decompose the objective function of the adversarial min-max game defining a periodic GAN into its Fourier series. By studying the dynamics of the truncated Fourier series for the continuous Alternating Gradient Descent algorithm, we are able to approximate the real flow and to identify the main features of the convergence of the GAN. This approach is confirmed empirically by studying the training flow in a $2$-parametric GAN aiming to generate an unknown exponential distribution. As a byproduct, we show that convergent orbits in GANs are small perturbations of periodic orbits, so the Nash equilibria are spiral attractors. This theoretically justifies the slow and unstable training observed in GANs.
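As a minimal sketch of the objects involved (the notation $V_N$, $\theta$, $\varphi$, the coefficients $c_{n,m}$ and the truncation order $N$ are illustrative assumptions, not taken from the text above), suppose the value function of the min-max game is $2\pi$-periodic in the generator parameter $\theta$ and the discriminator parameter $\varphi$. Its truncated Fourier series and the associated continuous-time descent-ascent flow would read
\[
V_N(\theta,\varphi) \;=\; \sum_{|n|\le N}\,\sum_{|m|\le N} c_{n,m}\, e^{i(n\theta + m\varphi)},
\qquad
\dot{\theta} \;=\; -\,\partial_{\theta} V_N(\theta,\varphi),
\qquad
\dot{\varphi} \;=\; +\,\partial_{\varphi} V_N(\theta,\varphi),
\]
where the generator descends and the discriminator ascends on $V_N$. Linearizing such a truncated flow around its critical points is the kind of analysis that would exhibit spiral (slowly converging) behaviour near the Nash equilibria described in the abstract.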