This paper studies how well generative adversarial networks (GANs) learn probability distributions from finite samples. Our main results estimate the convergence rates of GANs under a collection of integral probability metrics defined through H\"older classes, including the Wasserstein distance as a special case. We also show that GANs are able to adaptively learn data distributions that have low-dimensional structure or H\"older densities, when the network architectures are chosen properly. In particular, for distributions concentrated around a low-dimensional set, we prove that the learning rates of GANs depend not on the high ambient dimension but on the lower intrinsic dimension. Our analysis is based on a new oracle inequality that decomposes the estimation error into the generator approximation error, the discriminator approximation error, and the statistical error, which may be of independent interest.
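To fix notation (this is the standard definition of an integral probability metric, not a construction specific to this paper), the metric induced by a discriminator class $\mathcal{F}$ is
\[
d_{\mathcal{F}}(\mu, \nu) = \sup_{f \in \mathcal{F}} \Big\{ \mathbb{E}_{x \sim \mu}[f(x)] - \mathbb{E}_{x \sim \nu}[f(x)] \Big\},
\]
where taking $\mathcal{F}$ to be a H\"older class yields the family of metrics studied here; for the class of $1$-Lipschitz functions, Kantorovich--Rubinstein duality recovers the Wasserstein-$1$ distance. Schematically, the oracle inequality bounds the estimation error of the GAN estimate $\hat{\mu}_n$ of the target $\mu$ as
\[
d_{\mathcal{F}}(\hat{\mu}_n, \mu) \lesssim \epsilon_{\mathrm{gen}} + \epsilon_{\mathrm{disc}} + \epsilon_{\mathrm{stat}},
\]
where $\epsilon_{\mathrm{gen}}$, $\epsilon_{\mathrm{disc}}$, and $\epsilon_{\mathrm{stat}}$ are placeholder symbols for the generator approximation, discriminator approximation, and statistical error terms named in the decomposition above.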