We study the effect of Johnson-Lindenstrauss transforms on various Euclidean optimization problems. We ask, for a particular problem and an accuracy parameter $\epsilon \in (0, 1)$, what is the smallest target dimension $t \in \mathbb{N}$ such that a Johnson-Lindenstrauss transform $\Pi \colon \mathbb{R}^d \to \mathbb{R}^t$ preserves the cost of the optimal solution up to a $(1+\epsilon)$-factor.

$\bullet$ For center-based $(k,z)$-clustering, we show $t = O( (\log k + z \log(1/\epsilon)) / \epsilon^2)$ suffices, improving on $O(z^4 \log(k/\epsilon)/\epsilon^2)$ [MMR19].

$\bullet$ For $(k,z)$-subspace approximation, we show $t = \tilde{O}(zk^2 / \epsilon^3)$ suffices. The prior best bound, of $O(k/\epsilon^2)$, only applied to the case $z = 2$ [CEMMP15].

$\bullet$ For $(k,z)$-flat approximation, we show $t = \tilde{O}(zk^2/\epsilon^3)$ suffices, improving on a bound of $\tilde{O}(zk^2 \log n/\epsilon^3)$ [KR15].

$\bullet$ For $(k,z)$-line approximation, we show $t = O((k \log \log n + z + \log(1/\epsilon)) / \epsilon^3)$ suffices. No prior results were known.

All the above results follow from one general technique: we use algorithms for constructing coresets as an analytical tool in randomized dimensionality reduction.
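To make the setting concrete, here is a minimal illustrative sketch (not the paper's construction) of a Gaussian Johnson-Lindenstrauss transform $\Pi \colon \mathbb{R}^d \to \mathbb{R}^t$ applied to a point set, checking empirically that a simple clustering-style cost (sum of squared distances to the centroid, i.e. the $k=1$, $z=2$ case) is approximately preserved. The sizes `n`, `d`, `t` are arbitrary choices for the demo, not the bounds stated above.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, t = 200, 1000, 100  # illustrative sizes, not the paper's target dimensions

# A random point set in R^d.
X = rng.normal(size=(n, d))

# A Gaussian JL transform Pi: entries N(0, 1/t), so that
# E[||Pi^T x||^2] = ||x||^2 for every fixed x.
Pi = rng.normal(scale=1.0 / np.sqrt(t), size=(d, t))
Y = X @ Pi  # projected points in R^t


def centroid_cost(P):
    """Sum of squared distances to the centroid (the (1,2)-clustering cost)."""
    c = P.mean(axis=0)
    return float(np.sum((P - c) ** 2))


distortion = centroid_cost(Y) / centroid_cost(X)
print(f"cost ratio after projection: {distortion:.3f}")
```

For this cost the ratio concentrates near 1; the general results above say that, with a suitable target dimension $t$, the same kind of preservation holds for the optimal cost of much richer objectives (centers, subspaces, flats, lines).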