Comparing structured objects such as graphs is a fundamental operation in many learning tasks. To this end, the Gromov-Wasserstein (GW) distance, based on Optimal Transport (OT), has proven successful in handling the specific nature of such objects. More specifically, GW operates on graphs, seen as probability measures over specific spaces, through their nodes' connectivity relations. At the core of OT is the idea of conservation of mass, which imposes a coupling between all the nodes of the two graphs under comparison. We argue in this paper that this property can be detrimental for tasks such as graph dictionary or partition learning, and we relax it by proposing a new semi-relaxed Gromov-Wasserstein divergence. Aside from immediate computational benefits, we discuss its properties and show that it can lead to an efficient graph dictionary learning algorithm. We empirically demonstrate its relevance for complex tasks on graphs such as partitioning, clustering and completion.
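To make the mass-conservation point concrete, here is a minimal NumPy sketch (not the paper's implementation) of the GW objective for a coupling between two graphs, together with the feasibility check that distinguishes the standard from the semi-relaxed setting: standard GW constrains both marginals of the coupling, while the semi-relaxed divergence keeps only the first, letting the second graph's node weights adapt. The function names and the quadratic-loss choice are illustrative assumptions.

```python
import numpy as np

def gw_objective(C1, C2, T):
    """Gromov-Wasserstein objective for a coupling T between two graphs
    given by pairwise structure matrices C1 (n x n) and C2 (m x m),
    with the quadratic loss:
        sum_{i,j,k,l} (C1[i,k] - C2[j,l])**2 * T[i,j] * T[k,l]
    """
    # diff[i, j, k, l] = C1[i, k] - C2[j, l]
    # (O(n^2 m^2) memory; fine for small illustrative graphs)
    diff = C1[:, None, :, None] - C2[None, :, None, :]
    return np.einsum('ijkl,ij,kl->', diff ** 2, T, T)

def is_feasible(T, p, q=None, tol=1e-9):
    """Feasibility of a coupling T.

    Standard GW imposes both marginal (mass-conservation) constraints:
        T @ 1 = p   and   T.T @ 1 = q.
    The semi-relaxed divergence drops the second one, so pass q=None to
    check only the source marginal.
    """
    ok_rows = np.allclose(T.sum(axis=1), p, atol=tol)
    if q is None:  # semi-relaxed setting
        return ok_rows
    return ok_rows and np.allclose(T.sum(axis=0), q, atol=tol)
```

For two identical graphs, the scaled identity coupling is feasible and attains zero cost; and since dropping the second marginal constraint only enlarges the feasible set, the semi-relaxed optimum is never larger than the standard GW one.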