Based on the observation that additive Schwarz methods for general convex optimization can be interpreted as gradient methods, we propose an acceleration scheme for additive Schwarz methods. By adopting acceleration techniques developed for gradient methods, such as momentum and adaptive restarting, we greatly improve the convergence rate of additive Schwarz methods. The proposed acceleration scheme requires no a priori information on the smoothness or sharpness of the target energy functional, and can therefore be applied to a wide range of convex optimization problems. Numerical results for linear elliptic, nonlinear elliptic, nonsmooth, and nonsharp problems are provided to highlight the superiority and broad applicability of the proposed scheme.
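To make the idea concrete, the following is a minimal sketch, not the authors' implementation, of how an additive Schwarz iteration viewed as a gradient-type method can be combined with Nesterov-style momentum and a gradient-based adaptive restart in the spirit of O'Donoghue and Candès. The interface `local_solvers` (one callable per subdomain returning a local correction), the step size `tau`, and the specific restart test are illustrative assumptions.

```python
import numpy as np

def accelerated_additive_schwarz(x0, local_solvers, tau, max_iter=100):
    """Hypothetical sketch: additive Schwarz as a gradient-type method,
    accelerated with momentum and adaptive restarting.

    local_solvers : list of callables; each returns the local correction
                    for the current iterate on one subdomain (assumption).
    tau           : relaxation / step-size parameter (assumption).
    """
    x = x0.copy()
    y = x0.copy()   # extrapolated point
    t = 1.0         # momentum parameter (Nesterov sequence)
    for _ in range(max_iter):
        # Additive Schwarz step: the sum of local corrections plays the role
        # of a preconditioned negative gradient at the extrapolated point.
        correction = sum(solve(y) for solve in local_solvers)
        x_new = y + tau * correction

        # Adaptive restart: if the correction opposes the direction of travel,
        # reset the momentum so the next step is a plain Schwarz step.
        if np.dot(correction, x_new - x) < 0:
            t = 1.0
            y = x_new
        else:
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            t = t_new
        x = x_new
    return x
```

No smoothness or sharpness constants of the energy functional enter the iteration; only the step size and the restart test are needed, which is the property the abstract emphasizes.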