In the history of first-order algorithms, Nesterov's accelerated gradient descent (NAG) is one of the milestones. However, the cause of the acceleration has long been a mystery. The role played by the gradient correction was not revealed until the high-resolution differential equation framework was proposed in [Shi et al., 2021]. In this paper, we continue to investigate the acceleration phenomenon. First, we provide a significantly simplified proof based on a precise observation and a tighter inequality for $L$-smooth functions. Then, a new implicit-velocity high-resolution differential equation framework, together with the corresponding implicit-velocity phase-space representation and Lyapunov function, is proposed to investigate the convergence behavior of the iterative sequence $\{x_k\}_{k=0}^{\infty}$ of NAG. Furthermore, comparing the two phase-space representations, we find that the role played by the gradient correction is equivalent to that played by the velocity implicitly contained in the gradient, the only difference being that the iterative sequence $\{y_k\}_{k=0}^{\infty}$ is replaced by $\{x_k\}_{k=0}^{\infty}$. Finally, for the open question of whether the gradient norm minimization of NAG achieves the faster rate $o(1/k^3)$, we provide a positive answer together with its proof. Meanwhile, a faster rate of objective value minimization, $o(1/k^2)$, is shown for the case $r > 2$.
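For reference, a sketch of the NAG scheme and the gradient-correction high-resolution ODE discussed above, assuming the standard parameterization with step size $s$ and momentum coefficient $\frac{k-1}{k+r}$ (classical NAG corresponds to $r = 2$, for which the high-resolution ODE of [Shi et al., 2021] takes the form shown):
\begin{align*}
 & x_{k} = y_{k-1} - s\,\nabla f(y_{k-1}), \qquad y_{k} = x_{k} + \frac{k-1}{k+r}\,(x_{k} - x_{k-1}), \\
 & \ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \sqrt{s}\,\nabla^{2} f\big(X(t)\big)\,\dot{X}(t) + \Big(1 + \frac{3\sqrt{s}}{2t}\Big)\nabla f\big(X(t)\big) = 0,
\end{align*}
where the term $\sqrt{s}\,\nabla^{2} f(X)\,\dot{X}$ is the gradient correction that distinguishes NAG from Polyak's heavy-ball method in that framework.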