We propose an adaptive, energy-based time step for a large class of preconditioned gradient descent methods, applied mainly to constrained optimization problems. Our strategy relies on representing the usual descent direction as the product of an energy variable and a transformed gradient, where the preconditioning matrix may, for example, reflect the natural gradient induced by the underlying metric in parameter space, or act as a projection operator when linear equality constraints are present. We present theoretical results on both unconditional stability and convergence rates for three classes of objective functions. In addition, numerical experiments demonstrate the excellent performance of the proposed method on several benchmark optimization problems.
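The abstract does not spell out the update rule, so the sketch below is only a rough illustration of one plausible energy-adaptive, preconditioned scheme in the spirit of AEGD-type methods: a scalar energy variable r rescales a transformed gradient P∇f, and a fixed user-supplied preconditioner P can encode, e.g., a projection onto the null space of linear equality constraints. The function name, defaults, and the specific update formula are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def energy_adaptive_gd(f, grad_f, theta0, P=None, eta=0.1, c=1.0, n_iter=100):
    """Minimal sketch of an energy-adaptive, preconditioned gradient step.

    Assumes f(theta) + c > 0 so the energy r = sqrt(f + c) is well defined.
    P is a fixed preconditioner (identity by default); for constraints
    A theta = b one could pass the projector I - A.T @ inv(A @ A.T) @ A.
    This is an illustrative AEGD-style scheme, not the paper's formulation.
    """
    theta = np.asarray(theta0, dtype=float)
    if P is None:
        P = np.eye(theta.size)                      # plain (unpreconditioned) direction
    r = np.sqrt(f(theta) + c)                       # initial energy variable
    for _ in range(n_iter):
        # transformed gradient: preconditioned gradient of sqrt(f + c)
        v = P @ grad_f(theta) / (2.0 * np.sqrt(f(theta) + c))
        # energy update acts as the adaptive step size control
        r = r / (1.0 + 2.0 * eta * float(v @ v))
        # descent step: energy variable times transformed gradient
        theta = theta - 2.0 * eta * r * v
    return theta

# Toy usage: minimize f(x) = 0.5 * ||x||^2 from an all-ones start.
f = lambda x: 0.5 * float(x @ x)
grad_f = lambda x: x
x_min = energy_adaptive_gd(f, grad_f, np.ones(5), eta=0.5, n_iter=200)
```

In this sketch the monotone decrease of r, rather than the raw step size eta, is what would underlie an unconditional-stability argument; the specific constants and the scalar (versus componentwise) energy are design choices made here for brevity.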