In this paper, we study the design and analysis of a class of efficient algorithms for computing the Gromov-Wasserstein (GW) distance, tailored to large-scale graph learning tasks. Armed with the Luo-Tseng error bound condition~\citep{luo1992error}, the two proposed algorithms, called Bregman Alternating Projected Gradient (BAPG) and hybrid Bregman Proximal Gradient (hBPG), enjoy convergence guarantees. Based on task-specific properties, our analysis further provides novel theoretical insights to guide the selection of the best-fit method. As a result, we are able to conduct comprehensive experiments that validate the effectiveness of our methods on a host of tasks, including graph alignment, graph partition, and shape matching. In terms of both wall-clock time and modeling performance, the proposed methods achieve state-of-the-art results.