Stochastic Primal-Dual Hybrid Gradient (SPDHG) is an algorithm for efficiently solving a wide class of nonsmooth, large-scale optimization problems. In this paper we contribute to its theoretical foundations and prove its almost sure convergence for functionals that are convex but not necessarily strongly convex or smooth. We also prove its convergence for any sampling. In addition, we study SPDHG for parallel Magnetic Resonance Imaging reconstruction, where data from different coils are randomly selected at each iteration. We apply SPDHG with a wide range of random sampling methods and compare its performance across settings, including the mini-batch size and the step-size parameters. We show that the sampling can significantly affect the convergence speed of SPDHG and that in many cases an optimal sampling can be identified.
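For concreteness, the following is a minimal NumPy sketch of the serial-sampling SPDHG iteration applied to a toy sparse least-squares problem, min_x lam*||x||_1 + sum_i 0.5*||A_i x - b_i||^2. It is an illustration under stated assumptions, not the paper's implementation: the problem, the uniform sampling, and the step-size rule sigma_i = gamma/||A_i||, tau = gamma*min_i(p_i/||A_i||) (one common choice ensuring tau*sigma_i*||A_i||^2 <= gamma^2*p_i) are all illustrative, and every name and parameter below is hypothetical.

```python
# Minimal SPDHG sketch (illustrative, not the paper's implementation).
# Solves  min_x  lam*||x||_1 + sum_i 0.5*||A_i x - b_i||^2
# with serial uniform sampling over the n dual blocks.
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def spdhg(A_blocks, b_blocks, lam=0.1, gamma=0.99, n_iter=5000):
    n = len(A_blocks)
    d = A_blocks[0].shape[1]
    p = np.full(n, 1.0 / n)                       # uniform sampling probabilities
    norms = np.array([np.linalg.norm(A, 2) for A in A_blocks])
    sigma = gamma / norms                         # dual step sizes
    tau = gamma * np.min(p / norms)               # primal step size; this choice
                                                  # gives tau*sigma_i*||A_i||^2 <= gamma^2*p_i
    x = np.zeros(d)
    y = [np.zeros(A.shape[0]) for A in A_blocks]  # dual variables, one per block
    z = np.zeros(d)                               # running sum  z = sum_i A_i^T y_i
    zbar = z.copy()                               # extrapolated version of z
    for _ in range(n_iter):
        # primal update: prox of tau*lam*||.||_1
        x = soft_threshold(x - tau * zbar, tau * lam)
        # sample one block i with probability p_i
        i = rng.choice(n, p=p)
        # dual update: prox of sigma_i*f_i^*  for  f_i = 0.5*||. - b_i||^2
        v = y[i] + sigma[i] * (A_blocks[i] @ x)
        y_new = (v - sigma[i] * b_blocks[i]) / (1.0 + sigma[i])
        delta = A_blocks[i].T @ (y_new - y[i])
        y[i] = y_new
        z += delta
        zbar = z + delta / p[i]                   # extrapolation compensates for sampling
    return x

# Toy data: 4 blocks of a random least-squares problem with a sparse ground truth.
A_blocks = [rng.standard_normal((20, 30)) for _ in range(4)]
x_true = soft_threshold(rng.standard_normal(30), 1.0)
b_blocks = [A @ x_true for A in A_blocks]
x_hat = spdhg(A_blocks, b_blocks)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

In a parallel MRI setting of the kind studied here, each block would correspond to one coil's forward operator and measured data, so each iteration touches only the randomly selected coil.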