The standard randomized sparse Kaczmarz (RSK) method computes sparse solutions of linear systems of equations via sequential updates and thus cannot take advantage of parallel computation. In this work, we introduce a parallel (mini-batch) version of RSK based on averaging several Kaczmarz steps. This method naturally allows for parallelization, and we show that it can also leverage large over-relaxation. We prove linear expected convergence and show that, when parallel computation can be exploited, the method provably converges faster than the standard one. The method can also be viewed as a variant of the linearized Bregman algorithm, as a randomized dual block coordinate descent update, as a stochastic mirror descent update, or as a relaxed version of RSK, and it recovers the standard RSK method when the batch size equals one. We also provide error estimates for inconsistent systems and show that the iterates converge to within an error on the order of the noise level. Finally, numerical examples illustrate the benefits of the new algorithm.
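The averaged mini-batch update described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the choice of soft thresholding as the sparsity-promoting map (standard in the sparse Kaczmarz / linearized Bregman setting), and all parameter values are assumptions.

```python
import numpy as np

def soft_threshold(z, lam):
    # Soft-shrinkage operator: S_lam(z) = sign(z) * max(|z| - lam, 0).
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def minibatch_rsk(A, b, lam=1.0, batch_size=10, relaxation=1.0,
                  iters=5000, seed=0):
    """Sketch of a mini-batch randomized sparse Kaczmarz iteration.

    At each step, a random batch of rows is drawn, the individual
    Kaczmarz steps for those rows are averaged (optionally scaled by
    an over-relaxation parameter), and sparsity is enforced by a
    soft-thresholding step, as in the linearized Bregman method.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.sum(A**2, axis=1)  # squared row norms ||a_i||^2
    z = np.zeros(n)                   # auxiliary (dual-like) variable
    x = soft_threshold(z, lam)        # primal sparse iterate
    for _ in range(iters):
        idx = rng.choice(m, size=batch_size, replace=False)
        residuals = A[idx] @ x - b[idx]
        # Average the normalized Kaczmarz steps over the mini-batch;
        # each summand is ((a_i^T x - b_i)/||a_i||^2) a_i.
        step = (relaxation / batch_size) * (
            A[idx].T @ (residuals / row_norms[idx]))
        z -= step
        x = soft_threshold(z, lam)
    return x
```

With `batch_size=1` and `relaxation=1.0` this reduces to the standard sequential RSK update; the averaged steps for a batch are independent and could be computed in parallel.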