Comparing structured data from possibly different metric-measure spaces is a fundamental task in machine learning, with applications in, e.g., graph classification. The Gromov-Wasserstein (GW) discrepancy formulates a coupling between the structured data based on optimal transportation, tackling the incomparability between different structures by aligning the intra-relational geometries. Although efficient \emph{local} solvers such as conditional gradient and Sinkhorn are available, the inherent non-convexity still prevents a tractable evaluation, and the existing lower bounds are not tight enough for practical use. To address this issue, we take inspiration from the connection with the quadratic assignment problem, and propose the orthogonal Gromov-Wasserstein (OGW) discrepancy as a surrogate of GW. It admits an efficient and \emph{closed-form} lower bound with $\mathcal{O}(n^3)$ complexity, and directly extends to the fused Gromov-Wasserstein (FGW) distance, incorporating node features into the coupling. Extensive experiments on both synthetic and real-world datasets show the tightness of our lower bounds, and both OGW and its lower bounds efficiently deliver accurate predictions and satisfactory barycenters for graph sets.