We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, as a saddle-escaping smoothing Newton method, or as a variable metric proximal gradient method applied to a non-convex rank surrogate. It combines the favorable data efficiency of previous IRLS approaches with scalability improved by several orders of magnitude. We establish the first local convergence guarantee from a minimal number of samples for this class of algorithms, showing that the method attains a local quadratic convergence rate. Furthermore, we show that the linear systems to be solved remain well-conditioned even for very ill-conditioned ground truth matrices. We provide extensive experiments indicating that, unlike many state-of-the-art approaches, our method is able to complete very ill-conditioned matrices with a condition number of up to $10^{10}$ from a small number of samples, while remaining competitive in scalability.
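To illustrate the general IRLS idea the abstract refers to, the following is a minimal, hypothetical sketch (not the paper's exact algorithm): each iteration builds a weight matrix from a smoothed inverse of the current iterate and solves a weighted least-squares problem that keeps the observed entries fixed. The function name `irls_complete`, the target-rank parameter `r`, and the particular smoothing update are illustrative assumptions.

```python
import numpy as np

def irls_complete(M_obs, mask, r, n_iters=25):
    """Hypothetical IRLS sketch for low-rank matrix completion
    (an illustration of the algorithm class, not the paper's method)."""
    m, n = M_obs.shape
    X = np.where(mask, M_obs, 0.0)      # zero-fill initialization
    eps = np.linalg.norm(X, 2)          # smoothing parameter, shrunk over time
    for _ in range(n_iters):
        # Weight matrix W = (X X^T + eps^2 I)^{-1/2} via an eigendecomposition;
        # eps > 0 keeps the eigenvalues bounded away from zero.
        evals, evecs = np.linalg.eigh(X @ X.T + eps**2 * np.eye(m))
        W = (evecs * evals**-0.5) @ evecs.T
        # The quadratic min Tr(X^T W X) s.t. fixed observed entries decouples
        # column-wise: the unobserved block solves W_uu x_u = -W_uo x_o.
        for j in range(n):
            o, u = mask[:, j], ~mask[:, j]
            if u.any():
                X[u, j] = -np.linalg.solve(W[np.ix_(u, u)],
                                           W[np.ix_(u, o)] @ M_obs[o, j])
        # Shrink eps toward the (r+1)-th singular value of the new iterate,
        # so the smoothing vanishes as the iterate approaches rank r.
        s = np.linalg.svd(X, compute_uv=False)
        eps = max(min(eps, s[r]), 1e-12)
    return X
```

On a randomly sampled rank-1 matrix, this sketch typically fills in the missing entries to high accuracy; the exactly-solved column subproblems are what distinguish IRLS-type schemes from simple gradient iterations on the surrogate.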