This paper considers a large class of problems in which we seek to recover a low-rank matrix and/or a sparse vector from a set of measurements. While methods based on convex relaxations suffer from a (possibly large) estimator bias, and other nonconvex methods require the rank or sparsity to be known a priori, we use nonconvex regularizers to minimize the rank and the $l_0$ norm without the estimator bias of the convex relaxation. We present a novel analysis of the alternating proximal gradient descent algorithm applied to such problems, and we bound the error between the iterates and the ground-truth sparse and low-rank matrices. The algorithm and error bound apply to sparse optimization, matrix completion, and robust principal component analysis as special cases of our results.
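As a rough illustration of the kind of algorithm analyzed here (a sketch, not the paper's exact method), the robust PCA special case can be written as alternating proximal gradient steps on $\tfrac{1}{2}\|L+S-M\|_F^2 + \lambda_L\,\mathrm{rank}(L) + \lambda_S\|S\|_0$, where the nonconvex rank and $l_0$ penalties have closed-form proximal operators given by hard thresholding. The observation model $M = L + S$, the step size, and the penalty weights `lam_L`, `lam_S` below are illustrative assumptions:

```python
import numpy as np

def prox_l0(v, tau):
    # Prox of tau * ||.||_0: zero out entries with |v_i| <= sqrt(2 * tau).
    out = v.copy()
    out[np.abs(out) <= np.sqrt(2.0 * tau)] = 0.0
    return out

def prox_rank(V, tau):
    # Prox of tau * rank(.): hard-threshold the singular values at sqrt(2 * tau).
    U, s, Wt = np.linalg.svd(V, full_matrices=False)
    return (U * np.where(s > np.sqrt(2.0 * tau), s, 0.0)) @ Wt

def alt_prox_grad(M, lam_L=0.05, lam_S=0.05, step=0.5, iters=300):
    """Alternating proximal gradient descent on
    0.5 * ||L + S - M||_F^2 + lam_L * rank(L) + lam_S * ||S||_0."""
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(iters):
        # Gradient of the smooth term w.r.t. each block is the residual L + S - M.
        L = prox_rank(L - step * (L + S - M), step * lam_L)
        S = prox_l0(S - step * (L + S - M), step * lam_S)
    return L, S
```

With a step size below the per-block Lipschitz constant (here 1), each alternating update is a standard proximal gradient step, and the hard-thresholding proxes avoid the shrinkage bias that the nuclear-norm and $l_1$ relaxations would introduce.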