We give the first almost-linear-time algorithm for computing the \emph{maximal $k$-edge-connected subgraphs} of an undirected unweighted graph for any constant $k$. More specifically, given an $n$-vertex $m$-edge graph $G=(V,E)$ and a number $k = \log^{o(1)}n$, we can deterministically compute in $O(m+n^{1+o(1)})$ time the unique vertex partition $\{V_{1},\dots,V_{z}\}$ such that, for every $i$, $V_{i}$ induces a $k$-edge-connected subgraph while every superset $V'_{i}\supset V_{i}$ does not. Previously, linear-time algorithms were known only for $k\le2$ [Tarjan SICOMP'72]; all other known algorithms require $\Omega(m+n\sqrt{n})$ time even when $k=3$ [Chechik~et~al.~SODA'17; Forster~et~al.~SODA'20]. Our algorithm also extends to the decremental graph setting: we can deterministically maintain the maximal $k$-edge-connected subgraphs of a graph undergoing edge deletions in $m^{1+o(1)}$ total update time. Our key idea is a reduction to the dynamic algorithm supporting pairwise $k$-edge-connectivity queries [Jin and Sun FOCS'20].
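To make the computed object concrete, here is a naive baseline sketch (our own illustration, \emph{not} the paper's almost-linear-time algorithm): repeatedly find a global minimum cut of each connected piece; if the cut has fewer than $k$ edges, delete the cut edges and recurse on both sides. All function names below are ours, and the Stoer--Wagner min-cut subroutine makes this run in polynomial, far from almost-linear, time.

```python
# Naive baseline for maximal k-edge-connected subgraphs (illustration only,
# NOT the paper's algorithm): split along any global min cut of size < k.
import math
from collections import defaultdict

def stoer_wagner(nodes, edges):
    """Global min cut of a connected graph. `edges` maps sorted pairs (u, v)
    to edge multiplicities. Returns (cut_weight, side), where `side` is the
    set of original vertices on one shore of a minimum cut."""
    groups = {v: {v} for v in nodes}          # supernode -> merged vertices
    w = defaultdict(int)
    for (u, v), c in edges.items():
        w[(u, v)] += c
        w[(v, u)] += c
    live, best = set(nodes), (math.inf, set())
    while len(live) > 1:
        a = next(iter(live))                  # maximum-adjacency ordering
        order = [a]
        conn = {v: w[(a, v)] for v in live if v != a}
        while len(order) < len(live):
            z = max(conn, key=conn.get)       # most tightly connected vertex
            del conn[z]
            order.append(z)
            for v in conn:
                conn[v] += w[(z, v)]
        s, t = order[-2], order[-1]
        cut = sum(w[(t, v)] for v in live if v != t)  # cut of the phase
        if cut < best[0]:
            best = (cut, set(groups[t]))
        for v in live:                        # contract t into s
            if v not in (s, t):
                w[(s, v)] += w[(t, v)]
                w[(v, s)] += w[(v, t)]
        groups[s] |= groups[t]
        live.remove(t)
    return best

def maximal_kecc(vertices, edge_list, k):
    """Vertex partition {V_1, ..., V_z}: each part induces a k-edge-connected
    subgraph, and no strict superset of a part does."""
    edge_list = [(u, v) for (u, v) in edge_list
                 if u in vertices and v in vertices]
    adj = defaultdict(set)
    for u, v in edge_list:
        adj[u].add(v); adj[v].add(u)
    comps, seen = [], set()
    for s in vertices:                        # connected components first
        if s in seen:
            continue
        comp, stack = set(), [s]
        while stack:
            x = stack.pop()
            if x not in comp:
                comp.add(x); seen.add(x)
                stack.extend(adj[x] - comp)
        comps.append(comp)
    out = []
    for comp in comps:
        if len(comp) == 1:                    # a singleton is trivially k-ec
            out.append(comp)
            continue
        mult = defaultdict(int)
        for u, v in edge_list:
            if u in comp and v in comp:
                mult[(min(u, v), max(u, v))] += 1
        cut, side = stoer_wagner(comp, mult)
        if cut >= k:
            out.append(comp)                  # comp is already k-edge-connected
        else:                                 # delete the cut edges, recurse
            kept = [(u, v) for u, v in edge_list
                    if (u in side) == (v in side)]
            out.extend(maximal_kecc(side, kept, k))
            out.extend(maximal_kecc(comp - side, kept, k))
    return out

# Example: K4 (vertices 0..3) plus a pendant vertex 4 attached to vertex 3.
V = {0, 1, 2, 3, 4}
E = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3), (3, 4)]
parts = sorted(tuple(sorted(p)) for p in maximal_kecc(V, E, 3))
# For k = 3, K4 stays together and vertex 4 is split off as a singleton.
```

Note the contrast this makes explicit: the partition is unique, yet the naive recursion recomputes a global min cut after every split, which is exactly the $\Omega(m+n\sqrt{n})$-type bottleneck the paper's reduction avoids.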