Low-rank matrix approximation (LRMA) is one of the central concepts in machine learning, with applications in dimension reduction, de-noising, multivariate statistics, and many more areas. A recent extension of LRMA, called low-rank matrix completion (LRMC), solves the LRMA problem when some observations are missing and is especially useful for recommender systems. In this paper, we consider an element-wise weighted generalization of LRMA. The resulting weighted low-rank matrix approximation (WLRMA) technique therefore covers LRMC as a special case with binary weights. WLRMA has many applications. For example, it is an essential component of GLM optimization algorithms in which an exponential family is used to model the entries of a matrix and the matrix of natural parameters admits a low-rank structure. We propose an algorithm for solving the weighted problem, as well as two acceleration techniques. Further, we develop a non-SVD modification of the proposed algorithm that can handle extremely high-dimensional data. We compare the performance of all the methods on a small simulation example as well as a real-data application.
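To make the weighted objective concrete, the sketch below illustrates one classical EM-style approach to WLRMA (iteratively blending observed entries with the current low-rank fit, then truncating via SVD). This is a minimal illustration under the assumption of weights in [0, 1], not the algorithm proposed in the paper; the function name `wlrma_em` and all parameters are chosen here for illustration only.

```python
import numpy as np

def wlrma_em(X, W, rank, n_iter=200):
    """Illustrative EM-style sketch for weighted low-rank approximation.

    Minimizes (approximately) sum_ij W_ij * (X_ij - L_ij)^2 over rank-
    constrained L, for weights W in [0, 1]. Binary W recovers matrix
    completion as a special case. Not the paper's proposed algorithm.
    """
    # Initialize: keep observed entries, zero out the rest.
    Z = np.where(W > 0, X, 0.0)
    for _ in range(n_iter):
        # Truncated SVD gives the best unweighted rank-r fit to Z.
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Blend data and fit according to the weights (EM update).
        Z = W * X + (1.0 - W) * L
    return L
```

With all weights equal to one, each iteration reduces to a plain truncated SVD of `X`; with binary weights, the update fills the missing entries with the current fit, which is the familiar hard-impute scheme for LRMC.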