Generative Adversarial Networks (GANs) have achieved great success in unsupervised learning. Despite their remarkable empirical performance, there are limited theoretical studies on the statistical properties of GANs. This paper provides approximation and statistical guarantees for GANs in the estimation of data distributions that have densities in a H\"{o}lder space. Our main result shows that, if the generator and discriminator network architectures are properly chosen, GANs are consistent estimators of data distributions under strong discrepancy metrics, such as the Wasserstein-1 distance. Furthermore, when the data distribution exhibits low-dimensional structures, we show that GANs are capable of capturing the unknown low-dimensional structure in the data and enjoy a fast rate of statistical convergence that is free of the curse of ambient dimensionality. Our analysis for low-dimensional data builds upon a universal approximation theory of neural networks with Lipschitz continuity guarantees, which may be of independent interest.
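For concreteness, the following displays the standard dual form of the Wasserstein-1 distance and the generic minimax formulation of a GAN estimator that this type of analysis concerns; the generator class $\mathcal{G}$, discriminator class $\mathcal{F}$, sample size $n$, and latent distribution $\rho$ are generic placeholders here, not the specific network architectures analyzed in the paper:
\[
W_1(\mu, \nu) \;=\; \sup_{\|f\|_{\mathrm{Lip}} \le 1} \Big\{ \mathbb{E}_{x \sim \mu}[f(x)] - \mathbb{E}_{y \sim \nu}[f(y)] \Big\},
\qquad
\hat{g} \;\in\; \operatorname*{arg\,min}_{g \in \mathcal{G}} \, \max_{f \in \mathcal{F}} \, \Big\{ \frac{1}{n} \sum_{i=1}^{n} f(x_i) \;-\; \mathbb{E}_{z \sim \rho}\big[f(g(z))\big] \Big\},
\]
where $x_1, \ldots, x_n$ denote the observed data and the pushforward distribution of $\rho$ under $\hat{g}$ serves as the estimate of the data distribution.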