Momentum Iterative Hessian Sketch (M-IHS) techniques, a group of solvers for large-scale regularized linear Least Squares (LS) problems, are proposed and analyzed in detail. The proposed M-IHS techniques are obtained by incorporating Heavy Ball acceleration into the Iterative Hessian Sketch algorithm, and they provide significant improvements over randomized preconditioning techniques. By using approximate solvers throughout the iterations, the proposed techniques avoid all matrix decompositions and inversions, which is one of their main advantages over alternative solvers such as Blendenpik and LSRN. Similar to the Chebyshev semi-iterations, the M-IHS variants do not use any inner products and thereby eliminate the corresponding synchronization steps in hierarchical or distributed memory systems, yet M-IHS converges faster than Chebyshev semi-iteration based solvers. Lower bounds on the required sketch size for various randomized distributions are established through the error analyses. Unlike previously proposed approaches that produce a solution approximation, the proposed M-IHS techniques can use sketch sizes proportional to the statistical dimension, which is always smaller than the rank of the coefficient matrix. The relative computational saving becomes more significant as the regularization parameter or the singular value decay rate of the coefficient matrix increases.