Wasserstein Generative Adversarial Networks (WGANs) provide a versatile class of models which have attracted great attention in various applications. However, this framework has two main drawbacks: (i) the Wasserstein-1 (or Earth-Mover) distance is restrictive, so WGANs cannot always fit the data geometry well; (ii) it is difficult to train WGANs quickly. In this paper, we propose a new class of \textit{Relaxed Wasserstein} (RW) distances by generalizing the Wasserstein-1 distance with Bregman cost functions. We show that RW distances achieve favorable statistical properties without sacrificing computational tractability. Combined with the GAN framework, we develop Relaxed WGANs (RWGANs), which are not only statistically flexible but can also be approximated efficiently using heuristic approaches. Experiments on real images demonstrate that the RWGAN with the Kullback-Leibler (KL) cost function outperforms competing approaches, e.g., WGANs, even when trained with a gradient penalty.
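A minimal sketch of the construction described above, assuming the Bregman divergence of a strictly convex, differentiable generator $\phi$ replaces the Euclidean ground cost in the optimal-transport formulation; the notation $\mathrm{RW}_\phi$, $D_\phi$, and $\Pi(\mu,\nu)$ is illustrative and not taken verbatim from the paper.

```latex
% Standard Bregman divergence induced by a strictly convex, differentiable \phi:
\[
  D_\phi(x, y) \;=\; \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle .
\]
% Illustrative Relaxed Wasserstein distance with Bregman ground cost,
% where \Pi(\mu, \nu) denotes the set of couplings of \mu and \nu:
\[
  \mathrm{RW}_\phi(\mu, \nu) \;=\; \inf_{\pi \in \Pi(\mu, \nu)}
    \mathbb{E}_{(x, y) \sim \pi}\big[ D_\phi(x, y) \big].
\]
% Example of the KL cost mentioned in the abstract: taking
% \phi(x) = \sum_i x_i \log x_i on the probability simplex gives
% D_\phi(x, y) = \sum_i x_i \log (x_i / y_i) = \mathrm{KL}(x \,\|\, y).
```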