HyperGraph Convolutional Neural Networks (HGCNNs) have demonstrated their potential in modeling high-order relations preserved in graph-structured data. However, most existing convolution filters are localized and determined by the pre-defined initial hypergraph topology, neglecting implicit and long-range relations in real-world data. In this paper, we propose the first learning-based method tailored for constructing an adaptive hypergraph structure, termed HypERgrAph Laplacian aDaptor (HERALD), which serves as a generic plug-and-play module for improving the representational power of HGCNNs. Specifically, HERALD adaptively optimizes the adjacency relationship between hypernodes and hyperedges in an end-to-end manner, so that a task-aware hypergraph is learned. Furthermore, HERALD employs the self-attention mechanism to capture non-local pairwise relations between nodes. Extensive experiments on various popular hypergraph datasets for node classification and graph classification tasks demonstrate that our approach yields consistent and considerable performance gains, proving its effectiveness and generalization ability.
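To make the idea of an adaptive, task-aware hypergraph Laplacian concrete, the following is a minimal PyTorch sketch of a plug-in module in the spirit described above: node and hyperedge features are compared with an attention-style score, the initial incidence matrix is adjusted, and a normalized hypergraph Laplacian is rebuilt from the adapted incidence. This is not the authors' released implementation; all names (e.g., HeraldAdaptorSketch, normalized_laplacian) and design choices (sigmoid scoring, residual blending with the initial topology, uniform hyperedge weights) are illustrative assumptions.

```python
# Illustrative sketch (not the authors' code): learn a task-aware hypergraph
# Laplacian by adjusting the node-hyperedge incidence with attention scores.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HeraldAdaptorSketch(nn.Module):
    """Adapts an incidence matrix H from node features X, then builds a
    normalized hypergraph Laplacian (hypothetical example module)."""

    def __init__(self, in_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.node_proj = nn.Linear(in_dim, hidden_dim)   # projects nodes
        self.edge_proj = nn.Linear(in_dim, hidden_dim)   # projects hyperedges

    def forward(self, X: torch.Tensor, H: torch.Tensor) -> torch.Tensor:
        # X: (N, F) node features; H: (N, E) binary incidence matrix.
        # Hyperedge features as the mean of their member nodes.
        edge_feat = (H.t() @ X) / H.sum(dim=0, keepdim=True).t().clamp(min=1.0)
        q = self.node_proj(X)            # (N, D)
        k = self.edge_proj(edge_feat)    # (E, D)
        # Attention-style compatibility between every node and hyperedge.
        scores = torch.sigmoid(q @ k.t() / q.shape[-1] ** 0.5)  # (N, E)
        # Blend learned scores with the initial topology (residual-style).
        H_adapt = H + scores
        return self.normalized_laplacian(H_adapt)

    @staticmethod
    def normalized_laplacian(H: torch.Tensor) -> torch.Tensor:
        # L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}, with W = I here.
        Dv = H.sum(dim=1)                # node degrees (N,)
        De = H.sum(dim=0)                # hyperedge degrees (E,)
        Dv_inv_sqrt = torch.diag(Dv.clamp(min=1e-6).pow(-0.5))
        De_inv = torch.diag(De.clamp(min=1e-6).pow(-1.0))
        theta = Dv_inv_sqrt @ H @ De_inv @ H.t() @ Dv_inv_sqrt
        return torch.eye(H.shape[0]) - theta


# Usage: feed the adapted Laplacian to any spectral-style hypergraph conv.
if __name__ == "__main__":
    N, E, F_in = 8, 4, 16
    X = torch.randn(N, F_in)
    H = (torch.rand(N, E) > 0.5).float()
    adaptor = HeraldAdaptorSketch(F_in)
    L = adaptor(X, H)                           # (N, N) task-aware Laplacian
    conv = nn.Linear(F_in, 32)
    out = F.relu((torch.eye(N) - L) @ conv(X))  # one convolution step
    print(out.shape)                            # torch.Size([8, 32])
```

Because the module only consumes node features and an incidence matrix and returns a Laplacian, it can in principle be dropped in front of an existing HGCNN layer without changing the downstream architecture, which is the plug-and-play property the abstract emphasizes.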