In this paper, we address two important problems in low-rank learning: partial singular value decomposition and numerical rank estimation of huge matrices. Using Krylov subspace techniques, namely Golub-Kahan bidiagonalization (GK-bidiagonalization) and Ritz vectors, we propose two methods that solve these problems quickly and accurately. Our experiments demonstrate the advantages of the proposed methods over traditional and randomized singular value decomposition methods. The proposed methods are well suited to applications involving huge matrices in which accurate singular values, together with all of their corresponding singular vectors, are essential. As a real-world application, we evaluate the performance of our methods on Riemannian similarity learning between two image datasets, MNIST and USPS.
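To illustrate the kind of computation the abstract refers to (this is a generic sketch, not the authors' proposed method): a partial SVD of a large, numerically low-rank matrix can be computed with a Krylov-type iteration such as SciPy's `svds`, which is based on Lanczos/Golub-Kahan-style bidiagonalization, and checked against a full dense SVD on a problem small enough to afford one.

```python
# Minimal sketch: partial SVD via a Krylov-subspace routine (SciPy's
# `svds`), compared against the leading values of a full dense SVD.
# The matrix sizes and rank here are illustrative assumptions.
import numpy as np
from scipy.sparse.linalg import svds

rng = np.random.default_rng(0)
# Build a 500 x 400 matrix of numerical rank ~10 plus tiny noise.
A = rng.standard_normal((500, 10)) @ rng.standard_normal((10, 400))
A += 1e-10 * rng.standard_normal((500, 400))

k = 10                      # number of leading singular triplets
U, s, Vt = svds(A, k=k)     # partial SVD; values returned ascending
s = s[::-1]                 # sort descending for comparison

# Reference: leading singular values from a full (dense) SVD.
s_full = np.linalg.svd(A, compute_uv=False)[:k]
print(np.allclose(s, s_full, rtol=1e-6))
```

For a truly huge matrix the dense reference SVD is infeasible, which is exactly the regime where partial methods of this kind are needed.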