The Gromov-Wasserstein (GW) distance is a key tool for manifold learning and cross-domain learning, allowing the comparison of distributions that do not live in the same metric space. Because of its high computational complexity, several approximate GW distances have been proposed, based either on entropy regularization or on slicing combined with one-dimensional GW computation. In this paper, we propose a novel approach for comparing two incomparable distributions that hinges on the ideas of distributional slicing and embeddings, and on computing the closed-form Wasserstein distance between the sliced distributions. We provide a theoretical analysis of this new divergence, called the distributional sliced embedding (DSE) discrepancy, and we show that it preserves several interesting properties of the GW distance, including rotation invariance. We show that the embeddings involved in DSE can be efficiently learned. Finally, we provide a large set of experiments illustrating the behavior of DSE as a divergence in the context of generative modeling and in a query framework.
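The abstract refers to the closed-form one-dimensional Wasserstein distance between sliced distributions. As a general illustration of that building block only (not the paper's DSE discrepancy, which additionally involves learned embeddings and distributional slicing), the sketch below computes a Monte-Carlo sliced 2-Wasserstein distance between two empirical samples. It assumes both samples already live in the same d-dimensional space and have equal sample sizes; the function and parameter names are hypothetical.

```python
import numpy as np

def sliced_w2(x, y, n_projections=50, seed=0):
    """Monte-Carlo sliced 2-Wasserstein between empirical samples x, y
    of shape (n, d). Illustrates the closed-form 1D Wasserstein used
    after slicing: for one-dimensional empirical measures with equal
    sample sizes, W_2 reduces to comparing sorted samples."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    # Draw random directions uniformly on the unit sphere S^{d-1}.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Slicing step: project both samples onto each direction.
    x_proj = x @ theta.T  # shape (n, n_projections)
    y_proj = y @ theta.T
    # Closed-form 1D W_2: sort projections and compare order statistics.
    x_sorted = np.sort(x_proj, axis=0)
    y_sorted = np.sort(y_proj, axis=0)
    # Average squared 1D costs over samples and projections, then take the root.
    return np.sqrt(np.mean((x_sorted - y_sorted) ** 2))
```

This closed form is what makes slicing-based approximations cheap: each one-dimensional distance costs only a sort, i.e. O(n log n), instead of solving a full transport problem. Handling unequal sample sizes would require interpolating the empirical quantile functions rather than comparing sorted samples directly.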