Bilevel optimization is one of the fundamental problems in machine learning and optimization. Recent theoretical developments in bilevel optimization focus on finding first-order stationary points for nonconvex-strongly-convex cases. In this paper, we analyze algorithms that can escape saddle points in nonconvex-strongly-convex bilevel optimization. Specifically, we show that perturbed approximate implicit differentiation (AID) with a warm-start strategy finds an $\epsilon$-approximate local minimum of bilevel optimization in $\tilde{O}(\epsilon^{-2})$ iterations with high probability. Moreover, we propose the inexact NEgative-curvature-Originated-from-Noise Algorithm (iNEON), a pure first-order algorithm that can escape saddle points and find a local minimum of stochastic bilevel optimization. As a by-product, we provide the first nonasymptotic analysis of the perturbed multi-step gradient descent ascent (GDmax) algorithm, which converges to a local minimax point for minimax problems.
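The abstract's key mechanism, escaping saddle points by injecting noise near stationary points, can be illustrated on a simple single-level toy problem. This is a minimal sketch of noise-originated saddle escape, not the paper's AID or iNEON algorithms: the objective $f(x, y) = x^2 + y^4/4 - y^2/2$ (an assumed illustrative choice, not from the paper) has a strict saddle at the origin and minima at $(0, \pm 1)$. Plain gradient descent started at the saddle never moves; a small perturbation lets it escape along the negative-curvature direction.

```python
import numpy as np

def grad(v):
    # gradient of the toy objective f(x, y) = x^2 + y^4/4 - y^2/2,
    # which has a strict saddle at (0, 0) and minima at (0, +/-1)
    x, y = v
    return np.array([2.0 * x, y**3 - y])

def perturbed_gd(v0, eta=0.05, noise=1e-3, steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    v = np.array(v0, dtype=float)
    for _ in range(steps):
        g = grad(v)
        # near a stationary point the gradient gives no direction;
        # inject isotropic noise so negative curvature can amplify it
        if np.linalg.norm(g) < 1e-4:
            v = v + noise * rng.standard_normal(2)
        v = v - eta * grad(v)
    return v

# started exactly at the saddle, the perturbed iterates drift to a minimum
v = perturbed_gd([0.0, 0.0])
```

The threshold, step size, and noise scale here are arbitrary illustrative constants; the paper's analysis quantifies how such choices yield an $\epsilon$-approximate local minimum in $\tilde{O}(\epsilon^{-2})$ iterations with high probability.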