Computing a dense subgraph is a fundamental problem in graph mining, with a diverse set of applications ranging from electronic commerce to community detection in social networks. In many of these applications, the underlying context is better modelled as a weighted hypergraph that keeps evolving with time. This motivates the problem of maintaining the densest subhypergraph of a weighted hypergraph in a {\em dynamic setting}, where the input keeps changing via a sequence of updates (hyperedge insertions/deletions). Previously, the only known algorithm for this problem was due to Hu et al. [HWC17]. This algorithm worked only on unweighted hypergraphs, and had an approximation ratio of $(1+\epsilon)r^2$ and an update time of $O(\text{poly}(r, \log n))$, where $r$ denotes the maximum rank of the input hypergraph across all updates. We obtain a new algorithm for this problem, which works even when the input hypergraph is weighted. Our algorithm has a significantly improved (near-optimal) approximation ratio of $(1+\epsilon)$ that is independent of $r$, and a similar update time of $O(\text{poly}(r, \log n))$. It is the first $(1+\epsilon)$-approximation algorithm even for the special case of weighted simple graphs. To complement our theoretical analysis, we perform experiments with our dynamic algorithm on large-scale, real-world datasets. Our algorithm significantly outperforms the state of the art [HWC17] in terms of both accuracy and efficiency.
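For concreteness, the density objective referenced above is taken here to be the standard one for weighted hypergraphs (the abstract itself does not spell it out): for a hypergraph $H = (V, E, w)$ and a vertex set $S \subseteq V$,
\[
  \rho(S) \;=\; \frac{\sum_{e \in E \,:\, e \subseteq S} w(e)}{|S|},
  \qquad
  \rho^*(H) \;=\; \max_{\emptyset \neq S \subseteq V} \rho(S),
\]
and a dynamic $(1+\epsilon)$-approximation maintains, after every update, a set $S$ with $\rho(S) \geq \rho^*(H)/(1+\epsilon)$.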