We show that standard extragradient methods (i.e., mirror prox and dual extrapolation) recover optimal accelerated rates for first-order minimization of smooth convex functions. To obtain this result we provide a fine-grained characterization of the convergence rates of extragradient methods for solving monotone variational inequalities in terms of a natural condition we call relative Lipschitzness. We further generalize this framework to handle local and randomized notions of relative Lipschitzness, and thereby recover rates for box-constrained $\ell_\infty$ regression based on area convexity and complexity bounds achieved by accelerated (randomized) coordinate descent for smooth convex function minimization.
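For intuition, the following display is a minimal sketch of the mirror prox iteration and, roughly, the condition in question; the notation ($g$ for the monotone operator, $\eta$ for the step size, $V_z(w)$ for the Bregman divergence of a regularizer, and $\mathcal{Z}$ for the feasible set) is illustrative and fixed formally in the body of the paper. Each iteration takes a gradient step from the current iterate and an extragradient correction through the resulting point:
\[
w_t \leftarrow \arg\min_{w \in \mathcal{Z}} \left\{ \langle \eta g(z_t), w \rangle + V_{z_t}(w) \right\}, \qquad
z_{t+1} \leftarrow \arg\min_{z \in \mathcal{Z}} \left\{ \langle \eta g(w_t), z \rangle + V_{z_t}(z) \right\}.
\]
Relative Lipschitzness then asks, in place of a standard Lipschitz bound on $g$, that
\[
\langle g(w) - g(z), w - u \rangle \le \lambda \left( V_z(w) + V_w(u) \right) \quad \text{for all } u, w, z \in \mathcal{Z},
\]
for some parameter $\lambda > 0$ governing the resulting convergence rate.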