Topological deep learning is a rapidly growing field concerned with the development of deep learning models for data supported on topological domains such as simplicial complexes, cell complexes, and hypergraphs, which generalize many domains encountered in scientific computations. In this paper, we present a unifying deep learning framework built upon a richer data structure that includes widely adopted topological domains. Specifically, we first introduce combinatorial complexes, a novel type of topological domain. Combinatorial complexes can be seen as generalizations of graphs that maintain certain desirable properties. Similar to hypergraphs, combinatorial complexes impose no constraints on the set of relations. In addition, combinatorial complexes permit the construction of hierarchical higher-order relations, analogous to those found in simplicial and cell complexes. Thus, combinatorial complexes generalize and combine useful traits of both hypergraphs and cell complexes, which have emerged as two promising abstractions that facilitate the generalization of graph neural networks to topological spaces. Second, building upon combinatorial complexes and their rich combinatorial and algebraic structure, we develop a general class of message-passing combinatorial complex neural networks (CCNNs), focusing primarily on attention-based CCNNs. We characterize the permutation and orientation equivariances of CCNNs, and discuss pooling and unpooling operations within CCNNs in detail. Third, we evaluate the performance of CCNNs on tasks related to mesh shape analysis and graph learning. Our experiments demonstrate that CCNNs perform competitively against state-of-the-art deep learning models tailored specifically to the same tasks. These findings highlight the benefits of incorporating higher-order relations into deep learning models across different applications.
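To make the central object above concrete, the following is a minimal Python sketch, not the paper's reference implementation, of a combinatorial complex: cells are arbitrary non-empty vertex sets, each assigned an integer rank, and the only structural requirement enforced here is the assumption that the rank function respects set containment. The class and method names are illustrative.

```python
# Minimal sketch of a combinatorial complex (illustrative; not the paper's code).
# Cells are non-empty vertex sets carrying an integer rank. As with hypergraph
# edges, any vertex set may be a cell; as with simplicial and cell complexes,
# cells are organized hierarchically by rank.

class CombinatorialComplexSketch:
    def __init__(self):
        self.cells = {}  # frozenset of vertices -> integer rank

    def add_cell(self, vertices, rank):
        cell = frozenset(vertices)
        if not cell or rank < 0:
            raise ValueError("cells are non-empty and ranks are non-negative")
        # Assumed requirement: the rank function is order-preserving,
        # i.e. if x is contained in y then rank(x) <= rank(y).
        for other, other_rank in self.cells.items():
            if other <= cell and other_rank > rank:
                raise ValueError("a contained cell cannot have a larger rank")
            if cell <= other and rank > other_rank:
                raise ValueError("a containing cell cannot have a smaller rank")
        self.cells[cell] = rank

    def skeleton(self, rank):
        """All cells of the given rank."""
        return [cell for cell, r in self.cells.items() if r == rank]


cc = CombinatorialComplexSketch()
for v in (1, 2, 3):
    cc.add_cell([v], rank=0)     # vertices as rank-0 cells, as in a graph
cc.add_cell([1, 2], rank=1)      # an edge-like rank-1 cell
cc.add_cell([1, 2, 3], rank=2)   # a higher-order cell; the edges {1, 3} and
                                 # {2, 3} need not exist (no closure condition)
print(cc.skeleton(1))            # [frozenset({1, 2})]
```

This mirrors the traits highlighted above: hypergraph-like freedom in which relations may appear, combined with the rank hierarchy provided by simplicial and cell complexes.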
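The message-passing layers summarized above exchange features across ranks through neighborhood (e.g. incidence or adjacency) matrices. The sketch below shows one attention-weighted pass from a source rank to a target rank; the GAT-style scoring is an assumption chosen for concreteness rather than the paper's exact CCNN formulation, and all function and parameter names are hypothetical.

```python
import numpy as np

def masked_softmax(scores, mask):
    """Row-wise softmax restricted to entries where mask == 1."""
    scores = np.where(mask > 0, scores, -1e9)
    scores = scores - scores.max(axis=1, keepdims=True)
    weights = np.exp(scores) * (mask > 0)
    return weights / np.clip(weights.sum(axis=1, keepdims=True), 1e-12, None)

def attention_step(h_src, h_dst, neighborhood, W, a_dst, a_src):
    """One attention-weighted message-passing step from source-rank cells to
    target-rank cells, routed through a binary neighborhood matrix of shape
    (n_dst, n_src), e.g. an incidence matrix between rank-0 and rank-1 cells.
    """
    z_src = h_src @ W                                        # (n_src, d_out)
    z_dst = h_dst @ W                                        # (n_dst, d_out)
    # Attention logits for each (target, source) pair, GAT-style.
    logits = np.tanh(z_dst @ a_dst[:, None] + (z_src @ a_src)[None, :])
    att = masked_softmax(logits, neighborhood)               # (n_dst, n_src)
    return np.maximum(att @ z_src, 0.0)                      # ReLU of aggregated messages

# Toy example: 4 vertices (rank 0) send messages to 2 edges (rank 1).
rng = np.random.default_rng(0)
N = np.array([[1, 1, 0, 0],
              [0, 1, 1, 1]], dtype=float)                    # edge-by-vertex incidence
h_v = rng.normal(size=(4, 8))                                # vertex features
h_e = rng.normal(size=(2, 8))                                # edge features
W = rng.normal(size=(8, 16))
out = attention_step(h_v, h_e, N, W, rng.normal(size=16), rng.normal(size=16))
print(out.shape)                                             # (2, 16)
```

Stacking such steps over several neighborhood matrices and ranks, together with pooling and unpooling operators between ranks, is the basic pattern behind the CCNN layers described above.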