Stability is an important property of graph neural networks (GNNs) that explains their success in many problems of practical interest. Existing GNN stability results depend on the size of the graph, which restricts their applicability to graphs of moderate size. To understand the stability properties of GNNs on large graphs, we consider neural networks supported on manifolds. These networks are defined in terms of manifold diffusions mediated by the Laplace-Beltrami (LB) operator and are interpreted as limits of GNNs running on graphs of growing size. We define manifold deformations and show that they lead to perturbations of the manifold's LB operator consisting of an absolute and a relative perturbation term. We then define filters that split the infinite-dimensional spectrum of the LB operator into a finite number of partitions, and prove that manifold neural networks (MNNs) with these filters are stable to both absolute and relative perturbations of the LB operator. The stability results are illustrated numerically in resource allocation problems in wireless networks.
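As a minimal sketch of the kind of diffusion-based construction the abstract refers to, a manifold filter can be written as a weighted aggregation of heat diffusions generated by the LB operator, and an MNN layer as such a filter followed by a pointwise nonlinearity; the impulse response $\tilde{h}$, the layer signals $f_{\ell}$, and the nonlinearity $\sigma$ are illustrative symbols not fixed by the abstract:
\[
  \mathbf{h}(\mathcal{L})\, f \;=\; \int_{0}^{\infty} \tilde{h}(t)\, e^{-t\mathcal{L}} f \,\mathrm{d}t,
  \qquad
  f_{\ell} \;=\; \sigma\!\left( \mathbf{h}_{\ell}(\mathcal{L})\, f_{\ell-1} \right),
\]
where $\mathcal{L}$ is the LB operator and $e^{-t\mathcal{L}}$ is the diffusion it mediates; replacing $\mathcal{L}$ with a graph Laplacian recovers the GNNs whose large-graph limits the paper studies.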