This work studies the convergence and finite sample approximations of entropic regularized Wasserstein distances in the Hilbert space setting. Our first main result is that, for Gaussian measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn divergence is {\it strictly weaker} than convergence in the exact 2-Wasserstein distance. Specifically, a sequence of centered Gaussian measures converges in the 2-Sinkhorn divergence if the corresponding covariance operators converge in the Hilbert-Schmidt norm. This is in contrast to the previously known result that a sequence of centered Gaussian measures converges in the exact 2-Wasserstein distance if and only if the covariance operators converge in the trace class norm. In the reproducing kernel Hilbert space (RKHS) setting, the {\it kernel Gaussian-Sinkhorn divergence}, namely the Sinkhorn divergence between Gaussian measures defined on an RKHS, defines a semi-metric on the set of Borel probability measures on a Polish space, given a characteristic kernel on that space. Exploiting the Hilbert-Schmidt norm convergence, we obtain {\it dimension-independent} convergence rates for finite sample approximations of the kernel Gaussian-Sinkhorn divergence, of the same order as those of the Maximum Mean Discrepancy (MMD). These convergence rates apply in particular to the Sinkhorn divergence between Gaussian measures on Euclidean and infinite-dimensional Hilbert spaces. The sample complexity of the 2-Wasserstein distance between Gaussian measures on Euclidean space, while dimension-dependent and larger than that of the Sinkhorn divergence, still improves exponentially over the worst-case scenario in the literature.
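For reference, the entropic regularized optimal transport cost and the 2-Sinkhorn divergence referred to above can be written, in one common convention with squared-distance cost (a minimal sketch; the notation $\mathrm{OT}_{\varepsilon}$, $\mathrm{S}_{\varepsilon}$, and the regularization parameter $\varepsilon > 0$ are ours and not fixed by the abstract), as
\[
\mathrm{OT}_{\varepsilon}(\mu,\nu) = \min_{\pi \in \Pi(\mu,\nu)} \int \|x-y\|^{2}\, d\pi(x,y) + \varepsilon\, \mathrm{KL}(\pi \,\|\, \mu \otimes \nu),
\qquad
\mathrm{S}_{\varepsilon}(\mu,\nu) = \mathrm{OT}_{\varepsilon}(\mu,\nu) - \tfrac{1}{2}\bigl[\mathrm{OT}_{\varepsilon}(\mu,\mu) + \mathrm{OT}_{\varepsilon}(\nu,\nu)\bigr],
\]
where $\Pi(\mu,\nu)$ denotes the set of couplings of $\mu$ and $\nu$ and $\mathrm{KL}$ is the Kullback-Leibler divergence.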