In this paper, we develop a symmetric accelerated stochastic Alternating Direction Method of Multipliers (SAS-ADMM) for solving separable convex optimization problems with linear constraints. The objective function is the sum of a possibly nonsmooth convex function and an average of many smooth convex functions. Our proposed algorithm combines the ideas of ADMM with techniques from accelerated stochastic gradient methods, using variance reduction to solve the smooth subproblem. One main feature of SAS-ADMM is that its dual variable is symmetrically updated after each update of the separated primal variables, which allows a more flexible and larger convergence region for the dual variable than that of standard deterministic or stochastic ADMM. This new stochastic optimization algorithm is shown to converge in expectation with an $\mathcal{O}(1/T)$ convergence rate, where $T$ is the number of outer iterations. In addition, a 3-block extension of the algorithm and a variant based on an accelerated stochastic augmented Lagrangian method are also discussed. Our preliminary numerical experiments indicate that the proposed algorithm is very effective for solving separable optimization problems arising from big-data applications.
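To make the symmetric dual update concrete, a generic iteration of this type is sketched below for the model problem $\min_{x,y} f(x) + g(y)$ subject to $Ax + By = b$, where $f$ is the smooth average function and $g$ is possibly nonsmooth. This is a hedged illustration with penalty parameter $\beta > 0$ and relaxation factors $s, \tau$ assumed here for exposition, not necessarily the exact scheme analyzed in the paper:
\begin{align*}
x^{k+1} &\approx \operatorname*{arg\,min}_{x}\ \hat f_k(x) + \frac{\beta}{2}\Bigl\|Ax + By^k - b + \frac{\lambda^k}{\beta}\Bigr\|^2, \\
\lambda^{k+\frac12} &= \lambda^k + s\beta\,\bigl(Ax^{k+1} + By^k - b\bigr), \\
y^{k+1} &= \operatorname*{arg\,min}_{y}\ g(y) + \frac{\beta}{2}\Bigl\|Ax^{k+1} + By - b + \frac{\lambda^{k+\frac12}}{\beta}\Bigr\|^2, \\
\lambda^{k+1} &= \lambda^{k+\frac12} + \tau\beta\,\bigl(Ax^{k+1} + By^{k+1} - b\bigr),
\end{align*}
where $\hat f_k$ denotes a variance-reduced stochastic model of the smooth average, so the $x$-subproblem is solved inexactly by accelerated stochastic gradient steps, and the dual variable $\lambda$ is updated once after each of the two primal blocks.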