In this paper we propose a new minimization algorithm based on a slightly modified scalar auxiliary variable (SAV) approach, coupled with a relaxation step and an adaptive step-size strategy. It enjoys several distinct advantages over popular gradient-based methods: (i) it is unconditionally energy diminishing with respect to a modified energy that is intrinsically related to the original energy, so no parameter tuning is needed for stability; (ii) it allows large step sizes, which can effectively accelerate convergence. We also present a convergence analysis for a class of SAV-based algorithms, which includes our new algorithm without the relaxation step as a special case. We apply the new algorithm to several illustrative and benchmark problems and compare its performance with several popular gradient-based methods. The numerical results indicate that the new algorithm is very robust, and that its adaptive version usually converges significantly faster than those popular gradient-descent-based methods.
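To make the idea concrete, here is a minimal sketch of one SAV-type gradient descent, under our own assumptions: the auxiliary scalar is `r ≈ sqrt(E(x) + C)`, the scheme updates `r` by a scalar linear solve and then takes an explicit step in `x`, and the relaxation and adaptive step-size components of the actual algorithm are omitted. All names (`sav_descent`, `dt`, `C`) and the quadratic test energy are illustrative, not taken from the paper.

```python
import numpy as np

def sav_descent(E, gradE, x0, dt=0.5, C=1.0, steps=200):
    """Sketch of an SAV-type gradient descent (no relaxation step).

    The auxiliary scalar r approximates sqrt(E(x) + C); with this
    discretization the modified energy r**2 is non-increasing for any
    dt > 0, which illustrates the unconditional stability mentioned
    in the abstract (a sketch, not the paper's exact scheme).
    """
    x = np.asarray(x0, dtype=float)
    r = np.sqrt(E(x) + C)
    for _ in range(steps):
        b = gradE(x) / np.sqrt(E(x) + C)    # scaled gradient
        r = r / (1.0 + 0.5 * dt * (b @ b))  # scalar linear solve for r^{n+1}
        x = x - dt * r * b                  # explicit update for x^{n+1}
    return x, r

# Illustrative quadratic energy E(x) = |x|^2 / 2 (our own test problem).
E = lambda x: 0.5 * float(x @ x)
gradE = lambda x: x
x_min, r_final = sav_descent(E, gradE, x0=[2.0, -1.0], dt=1.0)
```

Note that because `r` enters the `x`-update only as a scalar damping factor, stability holds for any `dt`; without the paper's relaxation step, however, `r` need not track `sqrt(E(x) + C)` exactly as the iteration converges.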