Many problems in computer vision and machine learning can be cast as learning on hypergraphs that represent higher-order relations. Recent approaches to hypergraph learning extend graph neural networks based on message passing, which is simple yet fundamentally limited in its modeling of long-range dependencies and in expressive power. On the other hand, tensor-based equivariant neural networks enjoy maximal expressiveness, but their application to hypergraphs has been limited by heavy computation and strict assumptions of fixed-order hyperedges. We resolve these problems and present Equivariant Hypergraph Neural Network (EHNN), the first attempt to realize maximally expressive equivariant layers for general hypergraph learning. We also present two practical realizations of our framework based on hypernetworks (EHNN-MLP) and self-attention (EHNN-Transformer), which are easy to implement and theoretically more expressive than most message passing approaches. We demonstrate their capability on a range of hypergraph learning problems, including synthetic k-edge identification, semi-supervised classification, and visual keypoint matching, and report improved performance over strong message passing baselines. Our implementation is available at https://github.com/jw9730/ehnn.