This paper presents a novel backtracking strategy for additive Schwarz methods for general convex optimization problems, serving as an acceleration scheme. The proposed backtracking strategy is independent of local solvers, so it can be applied to any algorithm that fits the abstract framework of additive Schwarz methods. By allowing the step size to increase and decrease adaptively along the iterations, the strategy greatly improves the convergence rate of an algorithm, and this improved convergence rate is proven rigorously. In addition, by combining the proposed backtracking strategy with a momentum acceleration technique, we propose a further accelerated additive Schwarz method. Numerical results for various convex optimization problems are presented to support our theory.