The task of predicting missing entries of a matrix from a subset of known entries is known as \textit{matrix completion}. In today's data-driven world, data completion is essential, whether it is the main goal or a pre-processing step. Structured matrix completion encompasses any setting in which entries are not missing uniformly at random. In recent work, a modification of standard nuclear norm minimization (NNM) for matrix completion has been developed to take into account \emph{sparsity-based} structure in the missing entries. This notion of structure is motivated in many settings, including recommender systems, where the probability that an entry is observed depends on the value of the entry. We propose adjusting an Iteratively Reweighted Least Squares (IRLS) algorithm for low-rank matrix completion to take into account sparsity-based structure in the missing entries. We also present an iterative gradient-projection-based implementation of the algorithm that can handle large-scale matrices. Finally, we present a robust array of numerical experiments on matrices of varying sizes, ranks, and levels of structure. We show that our proposed method is comparable to the adjusted NNM on small matrices, and often outperforms the standard IRLS algorithm in structured settings on matrices up to size $1000 \times 1000$.
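As a rough illustration (a sketch in notation introduced here, not necessarily that of the referenced work), the sparsity-adjusted NNM mentioned above can be written, for a matrix $M$ observed on an index set $\Omega$ and a weight $\alpha \geq 0$, as
\[
\min_{X} \; \|X\|_* + \alpha \,\big\| \mathcal{P}_{\Omega^c}(X) \big\|_1 \quad \text{subject to} \quad \mathcal{P}_{\Omega}(X) = \mathcal{P}_{\Omega}(M),
\]
where $\mathcal{P}_{\Omega}$ projects onto the observed entries, $\mathcal{P}_{\Omega^c}$ onto the missing ones, and the $\ell_1$ term encodes the prior that missing entries are predominantly small or zero. An IRLS variant of this idea would, at each iteration, replace the nuclear norm by a reweighted quadratic surrogate such as $\operatorname{tr}\!\big(W_k X X^\top\big)$ with $W_k = \big(X_k X_k^\top + \epsilon_k^2 I\big)^{-1/2}$, in the spirit of standard IRLS schemes for low-rank recovery.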