We study the effect of Johnson-Lindenstrauss transforms on various Euclidean optimization problems. For a given problem and an accuracy parameter $\epsilon \in (0, 1)$, we ask for the smallest target dimension $t \in \mathbb{N}$ such that a Johnson-Lindenstrauss transform $\Pi \colon \mathbb{R}^d \to \mathbb{R}^t$ preserves the cost of the optimal solution up to a $(1+\epsilon)$-factor.

$\bullet$ For center-based $(k,z)$-clustering, we show that $t = O((\log k + z \log(1/\epsilon)) / \epsilon^2)$ suffices, improving on the bound of $O((\log k + z\log(1/\epsilon) + z^2)/\epsilon^2)$ from [MMR19].

$\bullet$ For $(k,z)$-subspace approximation, we show that $t = \tilde{O}(zk^2 / \epsilon^3)$ suffices. The prior best bound of $O(k/\epsilon^2)$ applied only to the case $z = 2$ [CEMMP15].

$\bullet$ For $(k,z)$-flat approximation, we show that $t = \tilde{O}(zk^2/\epsilon^3)$ suffices, improving on the bound of $\tilde{O}(zk^2 \log n/\epsilon^3)$ from [KR15].

$\bullet$ For $(k,z)$-line approximation, we show that $t = O((k \log \log n + z + \log(1/\epsilon)) / \epsilon^3)$ suffices. No prior results were known for this problem.

All of the above results follow from one general technique: we use algorithms for constructing coresets as an analytical tool in randomized dimensionality reduction.
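To make the setup concrete, here is a minimal numerical sketch (our own illustration, not the paper's method): it draws a Gaussian Johnson-Lindenstrauss transform, projects a synthetic $(k,z)$-clustering instance, and compares costs in the original and projected spaces. The Gaussian construction, the leading constant in $t$, and the use of fixed centers in place of the optimal solution are all assumptions made for the sake of the demonstration.

```python
import numpy as np

def jl_transform(d, t, rng):
    # Gaussian JL map Pi: R^d -> R^t with i.i.d. N(0, 1/t) entries,
    # so E[||Pi x||^2] = ||x||^2 for every x in R^d.
    return rng.normal(scale=1.0 / np.sqrt(t), size=(t, d))

def clustering_cost(X, C, z):
    # (k, z)-clustering cost of points X w.r.t. centers C:
    # sum over points of the z-th power of the distance to the nearest center.
    dists = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
    return float((dists.min(axis=1) ** z).sum())

rng = np.random.default_rng(0)
n, d, k, z, eps = 1000, 500, 10, 2, 0.25

# Synthetic input: n points scattered around k random centers in R^d.
centers = rng.normal(size=(k, d))
X = centers[rng.integers(k, size=n)] + 0.1 * rng.normal(size=(n, d))

# Target dimension t ~ (log k + z log(1/eps)) / eps^2; constant chosen ad hoc.
t = int((np.log(k) + z * np.log(1 / eps)) / eps**2)
Pi = jl_transform(d, t, rng)

cost_high = clustering_cost(X, centers, z)
cost_low = clustering_cost(X @ Pi.T, centers @ Pi.T, z)
print(f"t = {t}, cost ratio = {cost_low / cost_high:.3f}")  # expect roughly 1 +/- eps
```

On this instance the projected cost should concentrate within a $(1 \pm \epsilon)$-factor of the original, consistent with the guarantee the abstract describes, even though $t \ll d$.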