Generative Adversarial Networks are a popular method for learning distributions from data by modeling the target distribution as the image of a known distribution under a learned map. This map, often referred to as the generator, is optimized to minimize a chosen distance measure between the generated and target distributions. One commonly used measure for this purpose is the Wasserstein distance. However, the Wasserstein distance is hard to compute and optimize, and in practice entropic regularization techniques are used to improve numerical convergence. The influence of regularization on the learned solution, however, remains poorly understood. In this paper, we study how several popular entropic regularizations of the Wasserstein distance impact the solution in a simple benchmark setting where the generator is linear and the target distribution is a high-dimensional Gaussian. We show that entropic regularization promotes sparsification of the solution, while replacing the Wasserstein distance with the Sinkhorn divergence recovers the unregularized solution. Both regularization techniques remove the curse of dimensionality suffered by the Wasserstein distance. We show that the optimal generator can be learned to accuracy $\epsilon$ with $O(1/\epsilon^2)$ samples from the target distribution. We thus conclude that these regularization techniques can improve the quality of the generator learned from empirical data for a large class of distributions.
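For reference, one common convention for the two regularized objectives discussed above (the notation $\mu$, $\nu$, $\varepsilon$, $\Pi(\mu,\nu)$ is illustrative and not fixed by the abstract): the entropic Wasserstein cost adds a Kullback-Leibler penalty to the optimal transport problem, and the Sinkhorn divergence debiases it by subtracting the self-transport terms,
$$
W_\varepsilon(\mu, \nu) = \inf_{\pi \in \Pi(\mu,\nu)} \int \|x - y\|^2 \, d\pi(x,y) + \varepsilon \, \mathrm{KL}\!\left(\pi \,\middle\|\, \mu \otimes \nu\right),
\qquad
S_\varepsilon(\mu, \nu) = W_\varepsilon(\mu,\nu) - \tfrac{1}{2} W_\varepsilon(\mu,\mu) - \tfrac{1}{2} W_\varepsilon(\nu,\nu),
$$
where $\Pi(\mu,\nu)$ denotes the set of couplings of $\mu$ and $\nu$. The debiasing terms in $S_\varepsilon$ are what allow the Sinkhorn divergence to recover the unregularized solution in the linear-generator Gaussian setting studied here.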