In this paper, we introduce a randomized QLP decomposition called Rand-QLP. Operating on a matrix $\bf A$, Rand-QLP gives ${\bf A}={\bf QLP}^T$, where $\bf Q$ and $\bf P$ are orthonormal, and $\bf L$ is lower-triangular. Under the assumption that the rank of the input matrix is $k$, we derive several error bounds for Rand-QLP: bounds for the first $k$ approximate singular values and for the trailing block of the middle factor $\bf L$, which show that the decomposition is rank-revealing; bounds for the distance between approximate subspaces and the exact ones for all four fundamental subspaces of a given matrix; and bounds for the errors of low-rank approximations constructed from the columns of $\bf Q$ and $\bf P$. Rand-QLP is able to effectively leverage modern computational architectures, due to its use of random sampling and the unpivoted QR decomposition, thus addressing a serious bottleneck associated with classical algorithms such as the singular value decomposition (SVD), column-pivoted QR (CPQR), and most recently developed matrix decomposition algorithms. To assess the performance of the different algorithms, we use an Intel Xeon Gold 6240 CPU running at 2.6 GHz with an NVIDIA GeForce RTX 2080Ti GPU. In comparison to CPQR and the SVD, Rand-QLP achieves speedups of up to 5 times and 6.6 times, respectively, on the CPU, and up to 3.8 times and 4.4 times on the hybrid GPU architecture. In terms of approximation quality, our results on synthetic and real data show that the approximations produced by Rand-QLP are comparable to those of pivoted QLP and the optimal SVD, and in most cases are considerably better than those of CPQR.
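To make the structure of such a factorization concrete, the following is a minimal NumPy sketch of a randomized QLP-style decomposition built from Gaussian random sampling followed by two unpivoted QR factorizations. This is an illustrative construction under assumed design choices (the function name, the oversampling parameter, and the specific sampling/QR sequence are hypothetical), not necessarily the exact Rand-QLP algorithm analyzed in the paper.

```python
import numpy as np

def rand_qlp_sketch(A, k, oversample=10):
    """Illustrative randomized QLP-style factorization A ~ Q @ L @ P.T.

    Gaussian random sampling captures the range of A; two unpivoted QR
    factorizations then yield orthonormal Q and P and a lower-triangular
    middle factor L. Hypothetical sketch, not the authors' algorithm.
    """
    m, n = A.shape
    l = min(n, k + oversample)
    Omega = np.random.standard_normal((n, l))   # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)              # orthonormal basis for the range of A
    B = Q.T @ A                                 # l x n projection of A onto that basis
    P, R = np.linalg.qr(B.T)                    # unpivoted QR of B^T
    L = R.T                                     # lower-triangular middle factor
    return Q, L, P

# Usage on a synthetic rank-deficient matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50)) @ rng.standard_normal((50, 100))
Q, L, P = rand_qlp_sketch(A, k=50)
print(np.linalg.norm(A - Q @ L @ P.T) / np.linalg.norm(A))  # small relative error
```

Since $B^T = PR$ implies $B = R^T P^T = LP^T$, the approximation follows from ${\bf A} \approx {\bf Q}{\bf Q}^T{\bf A} = {\bf Q}{\bf B} = {\bf Q}{\bf L}{\bf P}^T$; only unpivoted QR and matrix multiplication are used, which is what makes this style of algorithm attractive on modern parallel hardware.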