We present an optimal gradient method for smooth strongly convex optimization. The method is optimal in the sense that its worst-case bound on the distance to an optimal point exactly matches the lower bound on the oracle complexity for the class of problems, meaning that no black-box first-order method can have a better worst-case guarantee without further assumptions on the class of problems at hand. In addition, we provide a constructive recipe for obtaining the algorithmic parameters of the method and illustrate that it can be used for deriving methods for other optimality criteria as well.
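To make the worst-case distance criterion concrete, here is a minimal baseline sketch (not the paper's method): plain gradient descent on an L-smooth, μ-strongly convex quadratic with the classical step size 2/(L+μ), for which the distance to the minimizer contracts by the well-known factor (L−μ)/(L+μ) per iteration. All problem data below are illustrative assumptions.

```python
import numpy as np

# Illustrative baseline, NOT the optimal method from the paper:
# gradient descent on f(x) = 0.5 * x^T A x, whose minimizer is x* = 0.
# A has eigenvalues in [mu, L], so f is mu-strongly convex and L-smooth.
mu, L = 1.0, 10.0
A = np.diag([mu, 3.0, L])

def grad(x):
    return A @ x

x0 = np.array([1.0, -2.0, 0.5])   # arbitrary starting point (assumption)
step = 2.0 / (L + mu)             # classical step size for this class
rho = (L - mu) / (L + mu)         # per-step worst-case contraction factor

x = x0.copy()
for _ in range(20):
    x = x - step * grad(x)

# The distance to the optimum is bounded by rho**20 times the initial distance.
print(np.linalg.norm(x) <= rho**20 * np.linalg.norm(x0) + 1e-12)  # True
```

The optimal method of the paper achieves a strictly better worst-case contraction than this baseline; the sketch only illustrates the optimality criterion (distance to an optimal point) that the abstract refers to.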