In this paper, we study the design and analysis of a class of efficient algorithms for computing the Gromov-Wasserstein (GW) distance tailored to large-scale graph learning tasks. Armed with the Luo-Tseng error bound condition~\cite{luo1992error}, the two proposed algorithms, called Bregman Alternating Projected Gradient (BAPG) and hybrid Bregman Proximal Gradient (hBPG), are proven to be (linearly) convergent. Depending on task-specific properties, our analysis further provides novel theoretical insights that guide the selection of the best-fitting method. As a result, we are able to conduct comprehensive experiments to validate the effectiveness of our methods on a host of tasks, including graph alignment, graph partition, and shape matching. In terms of both wall-clock time and modeling performance, the proposed methods achieve state-of-the-art results.
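To make the flavor of the alternating scheme concrete, the following is a minimal illustrative sketch (not the authors' exact BAPG algorithm) of an entropic mirror-descent step on the squared-loss GW objective, alternated with KL projections onto the two marginal constraints. The function name `bapg_gw_sketch`, the step-size parameter `rho`, and the dropped constant terms in the gradient are all assumptions for illustration.

```python
import numpy as np

def bapg_gw_sketch(C1, C2, p, q, rho=1.0, iters=50):
    """Illustrative sketch of a Bregman alternating projected gradient
    scheme for min_T sum (C1_ij - C2_kl)^2 T_ik T_jl, T in the coupling
    polytope. NOT the paper's exact method; rho is an assumed step size."""
    T = np.outer(p, q)  # feasible initial coupling
    for _ in range(iters):
        # Gradient of the squared-loss GW cross term (symmetric C1, C2);
        # terms constant on the coupling polytope are dropped.
        grad = -4.0 * C1 @ T @ C2
        # Entropic (KL) mirror step, then Bregman projections that restore
        # the row marginals p and column marginals q in turn.
        T = T * np.exp(-grad / rho)
        T = T * (p / T.sum(axis=1))[:, None]   # KL-project rows onto p
        T = T * (q / T.sum(axis=0))[None, :]   # KL-project columns onto q
    return T
```

After the final column projection the column sums match `q` exactly, while the row sums are only approximately `p`; running more iterations drives the two projections toward a common fixed point.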