We analyze several generic proximal splitting algorithms well suited to large-scale convex nonsmooth optimization. We derive sublinear and linear convergence results, with new rates on the function value suboptimality or the distance to the solution, as well as new accelerated versions using varying stepsizes. In addition, we propose distributed variants of these algorithms, which can be accelerated as well. While most existing results are ergodic, our nonergodic results significantly broaden our understanding of primal-dual optimization algorithms.
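To fix ideas, the sketch below shows the simplest member of the proximal splitting family, forward-backward splitting (proximal gradient), on a hypothetical lasso instance. It is purely illustrative and is not one of the paper's specific algorithms; the problem data, the function `forward_backward`, and the constant stepsize choice are all assumptions made for the example.

```python
# Illustrative sketch (not the paper's method): forward-backward splitting
# for min_x f(x) + g(x), with f smooth and g nonsmooth but prox-friendly.
# Example instance (an assumption): lasso, f(x) = 0.5*||Ax - b||^2, g = lam*||x||_1.
import numpy as np

def prox_l1(x, t):
    """Proximity operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, lam, n_iters=500):
    """Forward-backward splitting for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient of f
    gamma = 1.0 / L                 # constant stepsize here; any gamma_k in
                                    # (0, 2/L) is admissible, and the paper
                                    # studies varying stepsizes
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)            # forward (gradient) step on f
        x = prox_l1(x - gamma * grad, gamma * lam)  # backward (prox) step on g
    return x

# Small synthetic usage example
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
b = rng.standard_normal(50)
x_hat = forward_backward(A, b, lam=0.1)
```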