The soft SVD is a robust matrix decomposition algorithm and a key component of matrix completion methods. However, computing the soft SVD of large sparse matrices with conventional numerical SVD methods is often impractical due to their memory requirements. The Rank-Restricted Soft SVD (RRSS) algorithm introduced by Hastie et al. addressed this issue by sequentially computing low-rank SVDs that easily fit in memory. We analyze the convergence of the standard RRSS algorithm and give examples where the standard algorithm does not converge. We show that convergence requires a modification of the standard algorithm, and is related to non-uniqueness of the SVD. Our modification specifies a consistent choice of sign for the left singular vectors of the low-rank SVDs in the iteration. Under these conditions, we prove linear convergence of the singular vectors using a technique motivated by alternating subspace iteration. We then derive a fixed point iteration for the evolution of the singular values and show linear convergence to the soft thresholded singular values of the original matrix. This last step requires a perturbation result for fixed point iterations which may be of independent interest.
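To fix ideas, the soft SVD of a matrix X with threshold lambda replaces each singular value sigma_i by max(sigma_i - lambda, 0) while keeping the singular vectors. The following is a minimal NumPy sketch of this operation via a full SVD; it is only illustrative, since the point of the RRSS algorithm is precisely to avoid forming a full SVD for large sparse matrices. All function and variable names here are our own, not from the paper.

```python
import numpy as np

def soft_svd(X, lam):
    """Soft-thresholded SVD: U * max(S - lam, 0) * V^T.

    Illustrative only: computes a dense full SVD, which is exactly
    what RRSS avoids for large sparse X.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_soft = np.maximum(s - lam, 0.0)  # soft-threshold the singular values
    return U @ np.diag(s_soft) @ Vt

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 4))
Z = soft_svd(X, lam=0.5)
```

Note that soft thresholding typically reduces the rank of the result, since singular values at or below the threshold are set to zero; this is why a rank-restricted iteration that only ever stores a low-rank factorization can still converge to the exact soft SVD.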