In this work we introduce a convolution operation over the tangent bundle of Riemannian manifolds, defined in terms of exponentials of the Connection Laplacian operator. Building on this operation, we define tangent bundle filters and tangent bundle neural networks (TNNs), novel continuous architectures that operate on tangent bundle signals, i.e., vector fields over manifolds. Tangent bundle filters admit a spectral representation that generalizes those of scalar manifold filters, graph filters, and standard convolutional filters in continuous time. We then introduce a discretization procedure, in both the space and time domains, that makes TNNs implementable, and show that their discrete counterpart is a novel, principled variant of the recently introduced sheaf neural networks. We formally prove that this discretized architecture converges to the underlying continuous TNN. Finally, we numerically evaluate the effectiveness of the proposed architecture on several learning tasks, on both synthetic and real data.
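To make the discretization concrete, the following is a minimal sketch (not the paper's implementation) of a discrete tangent-bundle convolution: a block connection Laplacian is assembled from orthogonal transport maps on a cycle graph with 2-D fibres, and a filter is applied as a weighted sum of sampled heat-kernel exponentials, h(L)x = Σ_k w_k e^{-kΔL} x. All names (`build_connection_laplacian`, `tnn_filter`, the cycle-graph topology, the time step `dt`) are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix, used as an orthogonal transport map."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def build_connection_laplacian(n, rng, d=2):
    """Block connection Laplacian on a cycle graph with d-dim fibres.

    Diagonal blocks are degree * I; the off-diagonal block for edge
    (i, j) is -O_ij with O_ij orthogonal, so L is symmetric PSD.
    (Hypothetical construction for illustration.)
    """
    L = np.zeros((n * d, n * d))
    for i in range(n):
        j = (i + 1) % n
        O = rotation(rng.uniform(0.0, 2.0 * np.pi))
        L[i*d:(i+1)*d, j*d:(j+1)*d] = -O
        L[j*d:(j+1)*d, i*d:(i+1)*d] = -O.T
        L[i*d:(i+1)*d, i*d:(i+1)*d] += np.eye(d)
        L[j*d:(j+1)*d, j*d:(j+1)*d] += np.eye(d)
    return L

def expm_sym(A, t):
    """exp(-t * A) for symmetric A via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.exp(-t * w)) @ V.T

def tnn_filter(L, x, weights, dt=0.5):
    """Apply h(L) x = sum_k w_k * exp(-k * dt * L) x to a vector field x."""
    y = np.zeros_like(x)
    for k, w in enumerate(weights):
        y += w * expm_sym(L, k * dt) @ x
    return y

rng = np.random.default_rng(0)
L = build_connection_laplacian(8, rng)   # 8 nodes, 2-D fibres -> 16x16
x = rng.standard_normal(16)              # a discrete tangent bundle signal
y = tnn_filter(L, x, weights=[1.0, 0.5, 0.25])
```

A TNN layer would compose several such filters with a pointwise nonlinearity; since the k = 0 term is the identity (e^{0} = I), a filter with weights `[1.0]` leaves the signal unchanged, which is a convenient sanity check.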