A recent work of Larsen [Lar23] gave a faster combinatorial alternative to Bansal's SDP algorithm for finding a coloring $x\in\{-1,1\}^n$ that approximately minimizes the discrepancy $\mathrm{disc}(A,x) := \| A x \|_{\infty}$ of a general real-valued $m\times n$ matrix $A$. Larsen's algorithm runs in $\widetilde{O}(mn^2)$ time compared to Bansal's $\widetilde{O}(mn^{4.5})$-time algorithm, at the price of a slightly weaker logarithmic approximation ratio in terms of the hereditary discrepancy of $A$ [Ban10]. In this work we present a combinatorial $\widetilde{O}(\mathrm{nnz}(A) + n^3)$ time algorithm with the same approximation guarantee as Larsen, which is optimal for tall matrices $m=\mathrm{poly}(n)$. Using a more intricate analysis and fast matrix multiplication, we achieve $\widetilde{O}(\mathrm{nnz}(A) + n^{2.53})$ time, which breaks cubic runtime for square matrices, and bypasses the barrier of linear-programming approaches [ES14] for which input-sparsity time is currently out of reach. Our algorithm relies on two main ideas: (i) A new sketching technique for finding a projection matrix with short $\ell_2$-basis using implicit leverage-score sampling; (ii) A data structure for faster implementation of the iterative Edge-Walk partial-coloring algorithm of Lovett-Meka, using an alternative analysis that enables ``lazy'' batch-updates with low-rank corrections. Our result nearly closes the computational gap between real-valued and binary matrices (set-systems), for which input-sparsity time coloring was very recently obtained [JSS23].
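To make the objects in the abstract concrete, the following is a toy Python/NumPy sketch of the discrepancy $\mathrm{disc}(A,x) = \|Ax\|_\infty$ and of a heavily simplified Lovett-Meka-style Edge-Walk partial coloring: a Gaussian random walk projected orthogonal to frozen coordinates and nearly-tight row constraints. The step size, freezing tolerance, and constraint thresholds below are hypothetical illustrative choices, and this sketch omits the paper's actual contributions (leverage-score sketching and lazy batch-updates with low-rank corrections), which is where the speedups come from.

```python
import numpy as np


def disc(A, x):
    """Discrepancy of coloring x for matrix A: ||A x||_inf."""
    return np.abs(A @ x).max()


def edge_walk_partial_coloring(A, delta=0.1, seed=0, max_iters=20000):
    """Toy sketch of a Lovett-Meka-style Edge-Walk partial coloring.

    Repeatedly takes small Gaussian steps projected into the subspace
    orthogonal to frozen variables (those with |x_i| near 1) and to
    nearly-tight row constraints, stopping once a constant fraction of
    the coordinates is frozen. Illustrative only; parameters are
    hypothetical, not the paper's.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    # Row-constraint thresholds ~ sqrt(n) * row norm (hypothetical choice).
    thresh = 8.0 * np.sqrt(n) * np.maximum(np.linalg.norm(A, axis=1), 1e-12)
    for _ in range(max_iters):
        frozen = np.abs(x) >= 1 - delta
        if frozen.sum() >= n // 2:
            break  # partial coloring found: half the coordinates are set
        tight = np.abs(A @ x) >= thresh - delta
        # Directions to avoid: frozen coordinate axes and tight rows.
        V = np.eye(n)[frozen]
        if tight.any():
            V = np.vstack([V, A[tight]])
        g = rng.standard_normal(n)
        if V.size:
            # Project g onto the orthogonal complement of span(V).
            Q, _ = np.linalg.qr(V.T)
            g = g - Q @ (Q.T @ g)
        x = x + delta * g / max(np.linalg.norm(g), 1e-12)
        x = np.clip(x, -1.0, 1.0)
    return x
```

The returned $x$ lies in $[-1,1]^n$ with (typically) at least $n/2$ coordinates at distance at most $\delta$ from $\pm 1$; rounding the frozen coordinates and recursing on the rest yields a full coloring, which is the standard way such partial-coloring steps are used.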