Learning on high-order correlations has shown superiority in data representation learning, where hypergraphs have been widely used in recent decades. The performance of hypergraph-based representation learning methods, such as hypergraph neural networks, highly depends on the quality of the hypergraph structure. How to generate the hypergraph structure among data remains a challenging task. Missing and noisy data may lead to "bad connections" in the hypergraph structure and corrupt the hypergraph-based representation learning process. Therefore, revealing the high-order structure, i.e., the hypergraph behind the observed data, becomes an urgent and important task. To address this issue, we design a general paradigm of deep hypergraph structure learning, namely DeepHGSL, to optimize the hypergraph structure for hypergraph-based representation learning. Concretely, inspired by the information bottleneck principle for addressing the robustness issue, we first extend it to the hypergraph case, termed the hypergraph information bottleneck (HIB) principle. We then apply this principle to guide hypergraph structure learning: HIB is introduced into the loss function to minimize the noisy information carried by the hypergraph structure. In this way, the hypergraph structure is optimized during training, which can be regarded as enhancing the correct connections and weakening the wrong ones. Consequently, the proposed method extracts more robust representations even from a heavily noisy structure. Finally, we evaluate the model on four benchmark datasets for representation learning. The experimental results on both graph- and hypergraph-structured data demonstrate the effectiveness and robustness of our method compared with other state-of-the-art methods.
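To make the described pipeline concrete, the following is a minimal sketch (not the authors' released implementation) of how a learnable soft incidence matrix can be combined with an information-bottleneck-style loss: the cross-entropy term preserves task-relevant information while a KL term compresses noisy information propagated through the structure. All names (HGSLSketch, hypergraph_conv, hib_style_loss, beta) and the specific variational form are illustrative assumptions; the exact HIB objective in the paper may differ.

```python
# Minimal sketch: learnable hypergraph structure + IB-style loss (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F


def hypergraph_conv(X, H, W_edge=None):
    """Smooth node features with D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2} X (standard HGNN-style convolution)."""
    n, m = H.shape
    if W_edge is None:
        W_edge = torch.ones(m, device=H.device)
    Dv = (H * W_edge).sum(dim=1).clamp(min=1e-6)          # vertex degrees
    De = H.sum(dim=0).clamp(min=1e-6)                      # hyperedge degrees
    Dv_inv_sqrt = Dv.pow(-0.5)
    theta = Dv_inv_sqrt.unsqueeze(1) * H * W_edge / De     # D_v^{-1/2} H W D_e^{-1}
    theta = theta @ (H.t() * Dv_inv_sqrt.unsqueeze(0))     # ... H^T D_v^{-1/2}
    return theta @ X


class HGSLSketch(nn.Module):
    """Learnable (soft) incidence matrix + stochastic bottleneck encoder + classifier."""

    def __init__(self, in_dim, hid_dim, n_classes, H_init):
        super().__init__()
        # Soft incidence logits, initialized from the (possibly noisy) observed structure.
        self.H_logits = nn.Parameter(torch.logit(H_init.clamp(1e-4, 1 - 1e-4)))
        self.enc = nn.Linear(in_dim, 2 * hid_dim)           # outputs mean and log-variance of Z
        self.cls = nn.Linear(hid_dim, n_classes)

    def forward(self, X):
        H = torch.sigmoid(self.H_logits)                    # relaxed incidence in [0, 1]
        stats = hypergraph_conv(self.enc(X), H)
        mu, logvar = stats.chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        return self.cls(z), mu, logvar


def hib_style_loss(logits, y, mu, logvar, beta=0.01):
    """Cross-entropy (keep task-relevant info) + KL to N(0, I) (compress noisy structural info)."""
    ce = F.cross_entropy(logits, y)
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=-1).mean()
    return ce + beta * kl
```

In this sketch, optimizing the incidence logits jointly with the IB-style loss is what corresponds, at a high level, to enhancing correct connections and weakening wrong ones during training.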