Minimax problems have recently gained tremendous attention across the optimization and machine learning communities. In this paper, we introduce a new quasi-Newton method for minimax problems, which we call the $J$-symmetric quasi-Newton method. The method is obtained by exploiting the $J$-symmetric structure of the second-order derivative of the objective function in minimax problems. We show that the Hessian estimate (as well as its inverse) can be updated by a rank-2 operation, and the update rule turns out to be a natural generalization of the classic Powell-symmetric-Broyden (PSB) method from minimization problems to minimax problems. In theory, we show that our proposed quasi-Newton algorithm enjoys local Q-superlinear convergence to a desirable solution under standard regularity conditions. Furthermore, we introduce a trust-region variant of the algorithm that enjoys global R-superlinear convergence. Finally, we present numerical experiments that verify our theory and show the effectiveness of our proposed algorithms compared to Broyden's method and the extragradient method on three classes of minimax problems.
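For concreteness, the $J$-symmetric structure referred to above can be sketched as follows (a minimal illustration under standard saddle-point notation; the symbols $F$, $z$, and $J$ are illustrative choices and need not match the paper's). For $\min_x \max_y f(x, y)$ with $z = (x, y) \in \mathbb{R}^{n+m}$, the first-order optimality system and its Jacobian are
\[
F(z) = \begin{pmatrix} \nabla_x f(x, y) \\ -\nabla_y f(x, y) \end{pmatrix},
\qquad
\nabla F(z) = \begin{pmatrix} \nabla^2_{xx} f & \nabla^2_{xy} f \\ -\nabla^2_{yx} f & -\nabla^2_{yy} f \end{pmatrix},
\qquad
J = \begin{pmatrix} I_n & 0 \\ 0 & -I_m \end{pmatrix}.
\]
Since $J \nabla F(z) = \nabla^2 f(z)$ is symmetric, $\nabla F(z)$ satisfies $J \nabla F(z) = \nabla F(z)^{\top} J$, i.e., it is $J$-symmetric; this is the structure that the quasi-Newton estimates can be constrained to preserve.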