We develop a variant of the Monteiro-Svaiter (MS) acceleration framework that removes the need to solve an expensive implicit equation at every iteration. Consequently, for any $p\ge 2$ we improve the complexity of convex optimization with Lipschitz $p$th derivative by a logarithmic factor, matching a lower bound. We also introduce an MS subproblem solver that requires no knowledge of problem parameters, and implement it as either a second- or first-order method by solving linear systems or applying MinRes, respectively. On logistic regression our method outperforms previous second-order momentum methods, but under-performs Newton's method; simply iterating our first-order adaptive subproblem solver performs comparably to L-BFGS.
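For intuition, the first-order instantiation mentioned above reduces each subproblem to a shifted symmetric linear system that MinRes can solve using only Hessian-vector products. The Python sketch below is a simplified illustration under that reading; the function name `ms_subproblem_step`, the fixed regularization `lam`, and the toy quadratic are assumptions for demonstration, not the paper's parameter-free adaptive solver.

```python
# A minimal sketch, NOT the paper's adaptive scheme: the first-order
# variant amounts to (approximately) solving the shifted linear system
# (H + lam * I) d = -g with MinRes, via Hessian-vector products only.
# The function name, fixed lam, and toy data are illustrative assumptions.
import numpy as np
from scipy.sparse.linalg import LinearOperator, minres

def ms_subproblem_step(grad, hess_vec, lam):
    """Approximate solution of (H + lam*I) d = -grad via MinRes.

    hess_vec: callable v -> H @ v, so the Hessian is never formed
    explicitly (e.g., it can come from automatic differentiation).
    """
    n = grad.shape[0]
    A = LinearOperator((n, n), matvec=lambda v: hess_vec(v) + lam * v)
    d, info = minres(A, -grad)  # info == 0 signals convergence
    return d

# Toy usage on the quadratic f(x) = 0.5 * x^T H x + g^T x.
H = np.array([[2.0, 0.5], [0.5, 1.0]])
g = np.array([1.0, -1.0])
step = ms_subproblem_step(g, lambda v: H @ v, lam=0.1)
```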