In this paper, we develop a symmetric accelerated stochastic Alternating Direction Method of Multipliers (SAS-ADMM) for solving separable convex optimization problems with linear constraints. The objective function is the sum of a possibly nonsmooth convex function and an average of many smooth convex functions. Our proposed algorithm combines the ideas of ADMM with techniques from accelerated stochastic gradient methods, possibly with variance reduction, to solve the smooth subproblem. One main feature of SAS-ADMM is that its dual variable is symmetrically updated after each update of the separated primal variables, which allows a more flexible and larger convergence region of the dual variable than that of standard deterministic or stochastic ADMM. This new stochastic optimization algorithm is shown to converge ergodically in expectation with an O(1/T) convergence rate, where T is the number of outer iterations. Our preliminary experiments indicate that the proposed algorithm is very effective for solving separable optimization problems arising from big-data applications. Finally, 3-block extensions of the algorithm and its variant as an accelerated stochastic augmented Lagrangian method are discussed in the appendix.
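For concreteness, the problem class described above can be sketched in the standard two-block separable form below; the symbols f, g_i, A, B, b, and n are introduced here only for illustration and are not fixed by the abstract itself.

% Minimal sketch of the separable problem class (illustrative notation):
% a possibly nonsmooth convex f plus an average of n smooth convex g_i,
% coupled by a linear constraint in the two primal blocks x and y.
\begin{equation*}
  \min_{x,\,y}\; f(x) \;+\; \frac{1}{n}\sum_{i=1}^{n} g_i(y)
  \qquad \text{subject to} \qquad Ax + By = b.
\end{equation*}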