We present a Newton-type method that converges fast from any initialization and for arbitrary convex objectives with Lipschitz Hessians. We achieve this by merging the ideas of cubic regularization with a certain adaptive Levenberg--Marquardt penalty. In particular, we show that the iterates given by $x^{k+1}=x^k - \bigl(\nabla^2 f(x^k) + \sqrt{H\|\nabla f(x^k)\|} \mathbf{I}\bigr)^{-1}\nabla f(x^k)$, where $H>0$ is a constant, converge globally with a $\mathcal{O}(\frac{1}{k^2})$ rate. Our method is the first variant of Newton's method that has both cheap iterations and provably fast global convergence. Moreover, we prove that locally our method converges superlinearly when the objective is strongly convex. To boost the method's performance, we present a line search procedure that does not need prior knowledge of $H$ and is provably efficient.
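The update rule above can be sketched in a few lines of NumPy. This is an illustrative implementation, not the authors' code: the test objective $f(x)=\tfrac{1}{4}\sum_i x_i^4 + \tfrac{1}{2}\|x\|^2$ and the choice $H=1$ are assumptions made for the example.

```python
import numpy as np

def regularized_newton(grad, hess, x0, H=1.0, tol=1e-10, max_iter=100):
    """Iterate x_{k+1} = x_k - (∇²f(x_k) + sqrt(H·||∇f(x_k)||) I)^{-1} ∇f(x_k).

    `H` plays the role of the Hessian Lipschitz constant from the abstract;
    in practice it would be set by the paper's line-search procedure.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:  # gradient small enough: approximate stationary point
            break
        lam = np.sqrt(H * gnorm)  # adaptive Levenberg--Marquardt penalty
        x = x - np.linalg.solve(hess(x) + lam * np.eye(len(x)), g)
    return x

# Hypothetical test problem: f(x) = sum(x_i^4)/4 + ||x||^2/2, a strongly
# convex objective whose Hessian is Lipschitz on bounded sets; minimizer is 0.
grad = lambda x: x**3 + x
hess = lambda x: np.diag(3 * x**2 + 1)
x_star = regularized_newton(grad, hess, np.array([2.0, -3.0]))
```

Each iteration costs one linear solve, the same as a classical (damped) Newton step, which is what "cheap iterations" refers to in the abstract.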