Stochastic Primal-Dual Hybrid Gradient (SPDHG) was proposed by Chambolle et al. (2018) and is a practical tool for solving nonsmooth, large-scale optimization problems. In this paper we prove its almost sure convergence for convex, but not necessarily strongly convex, functionals. The proof relies on a classical supermartingale result and on rewriting the algorithm as a sequence of random continuous operators on the primal-dual space. We compare our analysis with a similar argument by Alacaoglu et al. and give sufficient conditions under which an unproven claim in their proof holds.
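For reference, a minimal sketch of one SPDHG iteration for the problem $\min_x g(x) + \sum_{i=1}^n f_i(A_i x)$, assuming uniform serial sampling of the dual blocks (probability $1/n$ each); the step sizes $\tau, \sigma_i$ and extrapolation parameter $\theta$ follow the notation of Chambolle et al. (2018), and only the sampled dual block is updated:
\begin{align*}
x^{k+1} &= \operatorname{prox}_{\tau g}\Bigl(x^k - \tau \sum_{i=1}^n A_i^* \bar{y}_i^k\Bigr),\\
&\text{select } j \in \{1,\dots,n\} \text{ uniformly at random},\\
y_j^{k+1} &= \operatorname{prox}_{\sigma_j f_j^*}\bigl(y_j^k + \sigma_j A_j x^{k+1}\bigr), \qquad y_i^{k+1} = y_i^k \text{ for } i \neq j,\\
\bar{y}^{k+1} &= y^{k+1} + \theta n \bigl(y^{k+1} - y^k\bigr).
\end{align*}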