We improve the understanding of the \emph{golden ratio algorithm}, which solves monotone variational inequalities (VI) and convex-concave min-max problems via the distinctive feature of adapting the step sizes to the local Lipschitz constants. Adaptive step sizes not only eliminate the need to pick hyperparameters, but they also remove the necessity of global Lipschitz continuity and can increase from one iteration to the next. We first establish the equivalence of this algorithm with popular VI methods such as reflected gradient, Popov, or optimistic gradient descent-ascent in the unconstrained case with constant step sizes. We then move on to the constrained setting and introduce a new analysis that allows the use of larger step sizes, completing the bridge between the golden ratio algorithm and the existing algorithms in the literature. In doing so, we in fact eliminate the link between the golden ratio $\frac{1+\sqrt{5}}{2}$ and the algorithm. Moreover, we improve the adaptive version of the algorithm: first, by removing the maximum step size hyperparameter (an artifact of the analysis), we improve the complexity bound; and second, we adjust it to nonmonotone problems with weak Minty solutions, with superior empirical performance.
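For concreteness, below is a minimal sketch of the adaptive golden ratio iteration in the form introduced by Malitsky, which this paper builds on: a convex combination of past iterates is maintained, and the step size is set from a local Lipschitz estimate rather than a global constant. The function name `agraal`, the parameter values, and the bilinear test problem are our own illustrative assumptions; the sketch also retains the maximum step size cap `la_max`, precisely the hyperparameter the paper argues can be removed, and is not the improved variant developed here.

```python
import numpy as np

def agraal(F, x0, x1, proj=lambda z: z, phi=1.5, la0=1e-2, la_max=1e6, iters=2000):
    """Sketch of the adaptive golden ratio algorithm (aGRAAL).

    F      : VI operator
    proj   : projection onto the constraint set (identity = unconstrained)
    phi    : averaging parameter in (1, (1+sqrt(5))/2]
    la_max : maximum step size cap (the paper shows it can be removed)
    """
    rho = 1.0 / phi + 1.0 / phi ** 2          # lets the step size grow between iterations
    x_prev, x = np.asarray(x0, float), np.asarray(x1, float)
    x_bar = x.copy()
    la_prev, th = la0, 1.0
    F_prev = F(x_prev)
    for _ in range(iters):
        Fx = F(x)
        num = np.linalg.norm(x - x_prev) ** 2
        den = np.linalg.norm(Fx - F_prev) ** 2
        # step size from a local Lipschitz estimate; no global constant needed
        la = rho * la_prev if den == 0 else min(
            rho * la_prev, phi * th * num / (4.0 * la_prev * den), la_max)
        x_bar = ((phi - 1.0) * x + x_bar) / phi   # golden-ratio averaging of iterates
        x_next = proj(x_bar - la * Fx)
        th = phi * la / la_prev
        x_prev, F_prev = x, Fx
        x, la_prev = x_next, la
    return x

# Illustration (our assumption): the monotone operator of the bilinear
# saddle-point problem min_u max_v u^T A v, i.e. F(u, v) = (A v, -A^T u).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20))
F = lambda z: np.concatenate([A @ z[20:], -A.T @ z[:20]])
z = agraal(F, rng.standard_normal(40), rng.standard_normal(40))
print(np.linalg.norm(F(z)))  # small residual indicates convergence
```

Note that no Lipschitz constant of $F$ is supplied anywhere: the ratio $\|x_k - x_{k-1}\|^2 / \|F(x_k) - F(x_{k-1})\|^2$ plays that role locally, which is the adaptivity the abstract refers to.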