Comparing structured data from possibly different metric-measure spaces is a fundamental task in machine learning, with applications in, e.g., graph classification. The Gromov-Wasserstein (GW) discrepancy formulates a coupling between the structured data based on optimal transportation, tackling the incomparability between different structures by aligning the intra-relational geometries. Although efficient local solvers such as conditional gradient and Sinkhorn are available, the inherent non-convexity still prevents a tractable evaluation, and the existing lower bounds are not tight enough for practical use. To address this issue, we take inspiration from the connection with the quadratic assignment problem and propose the orthogonal Gromov-Wasserstein (OGW) discrepancy as a surrogate of GW. It admits an efficient, closed-form lower bound with $\mathcal{O}(n^3)$ complexity, and directly extends to the fused Gromov-Wasserstein (FGW) distance, incorporating node features into the coupling. Extensive experiments on both synthetic and real-world datasets show the tightness of our lower bounds, and both OGW and its lower bounds efficiently deliver accurate predictions and satisfactory barycenters for graph sets.
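For reference, the GW discrepancy mentioned above is commonly written in the following quadratic form from the optimal-transport literature; the intra-relational matrices $C \in \mathbb{R}^{n \times n}$, $D \in \mathbb{R}^{m \times m}$ and marginals $p$, $q$ below are generic notation for this sketch, not necessarily the symbols used in the paper:
$$
\mathrm{GW}^2(C, D) \;=\; \min_{T \in \Pi(p, q)} \; \sum_{i,k=1}^{n} \sum_{j,l=1}^{m} \bigl| C_{ik} - D_{jl} \bigr|^2 \, T_{ij} \, T_{kl},
\qquad
\Pi(p, q) \;=\; \bigl\{\, T \ge 0 \;:\; T \mathbf{1}_m = p,\; T^{\top} \mathbf{1}_n = q \,\bigr\}.
$$
The non-convexity referred to in the abstract stems from the quadratic dependence of the objective on the coupling $T$, which is also what links the problem to the quadratic assignment problem.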