The Stochastic Primal-Dual Hybrid Gradient (SPDHG) algorithm, proposed by Chambolle et al. (2018), is an efficient method for solving a class of nonsmooth, large-scale optimization problems. In this paper we prove its almost sure convergence for convex, but not necessarily strongly convex, functionals. We also apply it to parallel Magnetic Resonance Imaging (MRI) reconstruction in order to test its performance. Our numerical results show that, for a range of settings, SPDHG converges significantly faster than its deterministic counterpart.
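To make the algorithm concrete, the following is a minimal sketch of one common serial form of the SPDHG iteration with uniform sampling, applied to a toy block-separable least-squares problem rather than MRI reconstruction. The function name, step-size choices, and problem setup are illustrative assumptions, not the paper's experimental configuration: at each iteration the primal variable takes a gradient-type step against an extrapolated dual aggregate, and a single randomly sampled dual block is updated via the proximal operator of the conjugate `f_i^*`.

```python
import numpy as np

def spdhg_least_squares(A_blocks, b_blocks, n_iter=20000, seed=0):
    """Illustrative SPDHG sketch for min_x sum_i (1/2)||A_i x - b_i||^2.

    Here g = 0 (so the primal prox is the identity) and each
    f_i(u) = (1/2)||u - b_i||^2, whose conjugate prox is closed-form.
    Sampling is uniform over the n dual blocks.
    """
    rng = np.random.default_rng(seed)
    n = len(A_blocks)
    d = A_blocks[0].shape[1]
    # Conservative step sizes: tau * sigma * ||A_i||^2 <= 0.81 / n,
    # an assumed safe instance of the usual SPDHG step-size condition.
    L = max(np.linalg.norm(Ai, 2) for Ai in A_blocks)
    sigma = 0.9 / L
    tau = 0.9 / (n * L)
    x = np.zeros(d)
    y = [np.zeros(Ai.shape[0]) for Ai in A_blocks]  # dual blocks
    z = np.zeros(d)        # running aggregate z = sum_i A_i^T y_i
    zbar = z.copy()        # extrapolated aggregate
    for _ in range(n_iter):
        # Primal step: prox of g = 0 is the identity.
        x = x - tau * zbar
        # Sample one dual block and update it via prox of sigma * f_i^*,
        # which for f_i(u) = (1/2)||u - b_i||^2 is (v - sigma*b_i)/(1 + sigma).
        i = rng.integers(n)
        v = y[i] + sigma * (A_blocks[i] @ x)
        y_new = (v - sigma * b_blocks[i]) / (1.0 + sigma)
        delta = A_blocks[i].T @ (y_new - y[i])
        y[i] = y_new
        z = z + delta
        # Extrapolation weighted by 1/p_i = n for uniform sampling.
        zbar = z + n * delta
    return x
```

For this toy problem the iterates should approach the ordinary least-squares solution; only one `A_i` is touched per iteration, which is the source of the per-iteration cost advantage over the deterministic primal-dual hybrid gradient method.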