We introduce the framework of continuous-depth graph neural networks (GNNs). Graph neural ordinary differential equations (GDEs) are formalized as the counterpart of GNNs in which the input-output relationship is determined by a continuum of GNN layers, blending discrete topological structures and differential equations. The proposed framework is shown to be compatible with a variety of static and autoregressive GNN models. Results demonstrate the general effectiveness of GDEs: in static settings they offer computational advantages by incorporating numerical methods in their forward pass; in dynamic settings, on the other hand, they improve performance by exploiting the geometry of the underlying dynamics.
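A minimal sketch of the idea, assuming PyTorch and the torchdiffeq package: a single GNN layer serves as the vector field of an ODE over a depth variable, and a numerical solver replaces the stack of discrete layers. The layer sizes, toy graph, and solver choice below are illustrative assumptions, not the exact configuration used in the experiments.

```python
# Hypothetical, minimal graph neural ODE (GDE) sketch; not the reference implementation.
import torch
import torch.nn as nn
from torchdiffeq import odeint

class GCNVectorField(nn.Module):
    """GNN layer used as the ODE vector field dH/ds = f(H(s), A)."""
    def __init__(self, adj_norm, dim):
        super().__init__()
        self.adj_norm = adj_norm          # normalized adjacency (fixed graph topology)
        self.lin = nn.Linear(dim, dim)

    def forward(self, s, h):
        # the depth variable s plays the role of time for the ODE solver
        return torch.tanh(self.adj_norm @ self.lin(h))

class GDE(nn.Module):
    """Continuous-depth GNN: node features evolve through a continuum of layers."""
    def __init__(self, adj_norm, dim):
        super().__init__()
        self.field = GCNVectorField(adj_norm, dim)
        self.s_span = torch.linspace(0.0, 1.0, 2)   # integrate depth from s=0 to s=1

    def forward(self, h0):
        # a numerical ODE method in the forward pass replaces stacked discrete layers
        return odeint(self.field, h0, self.s_span, method="rk4")[-1]

# Toy usage: 4 nodes, 8 features, symmetrically normalized adjacency with self-loops
A = torch.tensor([[0., 1., 0., 0.],
                  [1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [0., 0., 1., 0.]]) + torch.eye(4)
deg_inv_sqrt = A.sum(-1).pow(-0.5)
A_hat = deg_inv_sqrt[:, None] * A * deg_inv_sqrt[None, :]

model = GDE(A_hat, dim=8)
out = model(torch.randn(4, 8))   # final node embeddings H(1)
```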