Previous hypergraph expansions are carried out solely at either the vertex level or the hyperedge level, thereby missing the symmetric nature of data co-occurrence and resulting in information loss. To address this problem, this paper treats vertices and hyperedges equally and proposes a new hypergraph formulation named the \emph{line expansion (LE)} for hypergraph learning. The new expansion bijectively induces a homogeneous structure from the hypergraph by treating vertex-hyperedge pairs as ``line nodes''. By reducing the hypergraph to a simple graph, the proposed \emph{line expansion} makes existing graph learning algorithms compatible with the higher-order structure, and it is proven to be a unifying framework for various hypergraph expansions. We evaluate the proposed line expansion on five hypergraph datasets; the results show that our method beats SOTA baselines by a significant margin.
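The line-node construction described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the toy hypergraph and the adjacency rule (two line nodes are connected when they share either their vertex or their hyperedge) are assumptions made for demonstration.

```python
from itertools import combinations

# Hypothetical toy hypergraph: hyperedge id -> set of vertex ids.
hypergraph = {"e1": {1, 2, 3}, "e2": {2, 3}, "e3": {3, 4}}

# Line nodes: one per incident (vertex, hyperedge) pair.
line_nodes = [(v, e) for e, verts in hypergraph.items() for v in verts]

# Assumed adjacency: two line nodes are neighbors iff they share
# the same vertex or the same hyperedge (the symmetric treatment
# of vertices and hyperedges).
edges = {
    frozenset({a, b})
    for a, b in combinations(line_nodes, 2)
    if a[0] == b[0] or a[1] == b[1]
}
```

Because each line node carries both a vertex and a hyperedge identity, the resulting simple graph preserves co-occurrence information from both sides, which vertex-only or hyperedge-only expansions discard.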