The Stochastic Primal-Dual Hybrid Gradient (SPDHG) algorithm, proposed by Chambolle et al., efficiently solves a wide class of large-scale, nonsmooth optimization problems. In this paper we contribute to its theoretical foundations and prove its almost sure convergence for convex functionals that need be neither strongly convex nor smooth, defined on Hilbert spaces of arbitrary dimension. We also prove convergence for arbitrary samplings, and for several specific samplings we derive theoretically optimal step-size parameters that yield faster convergence. In addition, we propose SPDHG for parallel Magnetic Resonance Imaging (MRI) reconstruction, where data from different coils are randomly selected at each iteration. We apply SPDHG with a wide range of random sampling methods and compare its performance across settings, including mini-batch size, step-size parameters, and both convex and strongly convex objective functionals. We show that the choice of sampling can significantly affect the convergence speed of SPDHG, and we conclude that in many cases an optimal sampling method can be identified.
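For concreteness, the following is a minimal sketch of an SPDHG-style iteration for a problem of the form min_x g(x) + sum_i f_i(A_i x), with one dual block updated per iteration under serial uniform sampling. The toy real-valued forward operators, the quadratic per-coil data fit, the simple quadratic regularizer, and the step-size rule are illustrative assumptions standing in for the paper's actual parallel-MRI setup, not its exact configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of min_x g(x) + sum_i f_i(A_i x), with
# f_i(u) = 0.5 * ||u - b_i||^2 (per-coil data fit, real-valued here;
# actual MRI data would be complex) and g(x) = 0.5 * lam * ||x||^2
# (a simple placeholder for the paper's regularizers).
n, m, n_coils = 64, 48, 8
A = [rng.standard_normal((m, n)) / np.sqrt(m) for _ in range(n_coils)]
x_true = rng.standard_normal(n)
b = [Ai @ x_true for Ai in A]
lam = 0.1

# Serial uniform sampling: one block i per iteration, probability p_i.
p = np.full(n_coils, 1.0 / n_coils)

# Step sizes chosen so that sigma_i * tau * ||A_i||^2 = gamma^2 * p_i < p_i,
# one sufficient condition of the kind used in the SPDHG literature
# (the paper's optimized choices differ by sampling).
norms = np.array([np.linalg.norm(Ai, 2) for Ai in A])  # spectral norms
gamma = 0.99
sigma = gamma * p / norms**2
tau = gamma

x = np.zeros(n)
y = [np.zeros(m) for _ in range(n_coils)]
z = np.zeros(n)      # running z = sum_i A_i^T y_i
zbar = z.copy()      # extrapolated copy of z

for k in range(2000):
    # Primal step: prox of tau * g; for g quadratic this is a scaling.
    x = (x - tau * zbar) / (1.0 + tau * lam)

    # Randomly select one coil / dual block.
    i = rng.choice(n_coils, p=p)

    # Dual step on block i: prox of sigma_i * f_i^* in closed form.
    v = y[i] + sigma[i] * (A[i] @ x)
    y_new = (v - sigma[i] * b[i]) / (1.0 + sigma[i])

    # Update z and its extrapolation (theta = 1, scaled by 1/p_i).
    dz = A[i].T @ (y_new - y[i])
    y[i] = y_new
    z = z + dz
    zbar = z + dz / p[i]
```

Changing the sampling distribution p, the mini-batch size (updating several blocks per iteration), or the per-block step sizes sigma_i corresponds to the experimental comparisons described above.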