We consider optimization problems in which the goal is to find a $k$-dimensional subspace of $\mathbb{R}^n$, $k \ll n$, which minimizes a convex and smooth loss. Such problems generalize the fundamental task of principal component analysis (PCA) to include robust and sparse counterparts, and logistic PCA for binary data, among others. This problem could be approached either via nonconvex gradient methods with highly efficient iterations, for which arguing about fast convergence to a global minimizer is difficult, or via a convex relaxation, for which arguing about convergence to a global minimizer is straightforward, but for which the corresponding methods are often inefficient in high dimensions. In this work we bridge these two approaches under a strict complementarity assumption, which in particular implies that the optimal solution to the convex relaxation is unique and is also the optimal solution to the original nonconvex problem. Our main result is a proof that a natural nonconvex gradient method which is \textit{SVD-free} and requires only a single QR factorization of an $n\times k$ matrix per iteration, converges locally with a linear rate. We also establish linear convergence results for the nonconvex projected gradient method, and for the Frank-Wolfe method when applied to the convex relaxation.
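To illustrate the kind of SVD-free iteration described above, the following is a minimal sketch for the simplest instance of the problem class, classical PCA with the quadratic loss $f(U) = -\mathrm{tr}(U^\top A U)$ over orthonormal $n\times k$ matrices. Each iteration takes a Euclidean gradient step and re-orthonormalizes via a single QR factorization of an $n\times k$ matrix; the function name, step size, and iteration count are illustrative assumptions, not the paper's algorithm or tuning.

```python
import numpy as np

def qr_gradient_subspace(A, k, eta=0.1, iters=500, seed=0):
    """Sketch of an SVD-free subspace method for min f(U) = -tr(U^T A U)
    over n x k matrices U with orthonormal columns (A symmetric PSD).
    Per iteration: one gradient step + one QR factorization of an
    n x k matrix as the retraction back onto the orthonormal set."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    # Random orthonormal initialization.
    U, _ = np.linalg.qr(rng.standard_normal((n, k)))
    for _ in range(iters):
        G = -2.0 * A @ U                     # Euclidean gradient of f at U
        U, _ = np.linalg.qr(U - eta * G)     # single QR of an n x k matrix
    return U

# Usage: recover the top-2 eigenspace of a random PSD matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 20))
A = M @ M.T
U = qr_gradient_subspace(A, k=2)
```

For this quadratic loss the update reduces to orthogonal (subspace) iteration on $I + 2\eta A$, so the iterates converge linearly to the top-$k$ eigenspace of $A$ at a rate governed by the eigengap; for general smooth losses only the gradient computation changes, while the per-iteration cost remains one QR of an $n\times k$ matrix.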