Wasserstein gradient flows on probability measures have found a host of applications in various optimization problems. They typically arise as the continuum limit of exchangeable particle systems evolving by some mean-field interaction involving a gradient-type potential. However, in many problems, such as in multi-layer neural networks, the so-called particles are edge weights on large graphs whose nodes are exchangeable. Such large graphs are known to converge to continuum limits, called graphons, as their sizes grow to infinity. We show that the Euclidean gradient flow of a suitable function of the edge weights converges to a novel continuum limit given by a curve on the space of graphons that can be appropriately described as a gradient flow or, more technically, a curve of maximal slope. Several natural functions on graphons, such as homomorphism functions and the scalar entropy, are covered by our set-up, and these examples are worked out in detail.
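As a purely illustrative sketch (ours, not taken from the abstract), the finite-$n$ object in question can be pictured as follows: for a symmetric edge-weight matrix $W = (w_{ij})_{i,j=1}^{n}$ on $n$ exchangeable nodes, a homomorphism-type function such as the triangle density is
$$
F(W) \;=\; \frac{1}{n^{3}} \sum_{i,j,k=1}^{n} w_{ij}\, w_{jk}\, w_{ki},
$$
and the Euclidean gradient flow of $F$, rescaled by a factor of $n^{2}$ (this scaling is our assumption for the sketch), reads
$$
\frac{d}{dt}\, w_{ij}(t) \;=\; -\, n^{2}\, \frac{\partial F}{\partial w_{ij}}\bigl(W(t)\bigr) \;=\; -\,\frac{c}{n} \sum_{k=1}^{n} w_{ik}(t)\, w_{kj}(t),
$$
where $c$ is a constant determined by the symmetry convention for $w_{ij} = w_{ji}$. The abstract's claim is that, as $n \to \infty$, such flows converge to a curve of maximal slope on the space of graphons.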