We study inductive matrix completion (matrix completion with side information) under an i.i.d. subgaussian noise assumption in the low-noise regime, with uniform sampling of the entries. We obtain for the first time generalization bounds with the following three properties: (1) they scale like the standard deviation of the noise and, in particular, approach zero in the exact recovery case; (2) even in the presence of noise, they converge to zero as the sample size approaches infinity; and (3) for a fixed dimension of the side information, they depend only logarithmically on the size of the matrix. Unlike many works on approximate recovery, we present results both for bounded Lipschitz losses and for the absolute loss, with the latter relying on Talagrand-type inequalities. The proofs create a bridge between two approaches to the theoretical analysis of matrix completion, since they combine techniques from both the exact recovery literature and the approximate recovery literature.
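For concreteness, a minimal sketch of the standard inductive matrix completion setting described above; the notation here (side-information matrices $X$, $Z$, core matrix $A^{*}$, noise level $\sigma$) is an illustrative assumption and may differ from the paper's:
\[
  M^{*} = X A^{*} Z^{\top}, \qquad X \in \mathbb{R}^{n \times d_1},\quad Z \in \mathbb{R}^{m \times d_2},
\]
\[
  Y_{ij} = M^{*}_{ij} + \sigma \xi_{ij}, \qquad (i,j) \sim \mathrm{Unif}\bigl([n] \times [m]\bigr),\quad \xi_{ij} \ \text{i.i.d. subgaussian}.
\]
In these (assumed) terms, the three properties read: the bounds scale with $\sigma$ and vanish when $\sigma = 0$ (exact recovery); they tend to zero as the number of observed entries grows; and for fixed $d_1, d_2$ they depend on $n$ and $m$ only logarithmically.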