We propose \emph{neighborhood-based core decomposition}: a novel way of decomposing hypergraphs into hierarchical neighborhood-cohesive subhypergraphs. Alternative approaches to decomposing hypergraphs, e.g., reduction to clique or bipartite graphs, are not meaningful in certain applications, and the latter also results in inefficient decomposition; meanwhile, existing degree-based hypergraph decomposition does not distinguish nodes with different neighborhood sizes. Our case studies show that the proposed decomposition is more effective than degree- and clique graph-based decompositions in disease intervention and in extracting provably approximate and application-wise meaningful densest subhypergraphs. We propose three algorithms: \textbf{Peel}, its efficient variant \textbf{E-Peel}, and a novel local algorithm, \textbf{Local-core}, with a parallel implementation. Our most efficient parallel algorithm, \textbf{Local-core(P)}, decomposes a hypergraph with 27M nodes and 17M hyperedges in-memory within 91 seconds by adopting various optimizations. Finally, we develop a new hypergraph-core model, the \emph{(neighborhood, degree)-core}, by considering both neighborhood and degree constraints, design its decomposition algorithm \textbf{Local-core+Peel}, and demonstrate its superiority in diffusion spreading.
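To illustrate the core idea, the following is a minimal sketch of neighborhood-based peeling (in the spirit of \textbf{Peel}, not the paper's optimized implementation): repeatedly remove a node with the fewest distinct neighbors in the remaining subhypergraph, and assign core numbers as the running maximum of removal thresholds. All function names here are illustrative assumptions; the hypergraph is assumed to be given as a list of hyperedges (sets of nodes).

```python
def nbr_core_decompose(hyperedges):
    """Hypothetical sketch: assign each node its neighborhood-core number
    by repeated peeling. A node's neighborhood is the set of other nodes
    sharing at least one hyperedge with it among the still-active nodes."""
    edges = [set(e) for e in hyperedges]
    nodes = set().union(*edges) if edges else set()

    def neighbors(v, active):
        # Distinct neighbors of v restricted to the active subhypergraph.
        nbrs = set()
        for e in edges:
            live = e & active
            if v in live:
                nbrs |= live
        nbrs.discard(v)
        return nbrs

    active = set(nodes)
    core = {}
    k = 0
    while active:
        # Peel a node with the smallest current neighborhood size.
        v = min(active, key=lambda u: len(neighbors(u, active)))
        k = max(k, len(neighbors(v, active)))
        core[v] = k
        active.remove(v)
    return core
```

For example, on the hyperedges {a, b, c} and {c, d}, node d has a single neighbor (c) and is peeled first with core number 1, while a, b, and c mutually neighbor each other and receive core number 2. This naive sketch recomputes neighborhoods from scratch at every step; the paper's \textbf{E-Peel} and \textbf{Local-core} variants avoid exactly this kind of redundant work.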