We consider stochastic unconstrained bilevel optimization problems in which only first-order gradient oracles are available. While numerous optimization methods have been proposed for tackling bilevel problems, existing methods either require possibly expensive computations involving Hessians of the lower-level objectives, or lack rigorous finite-time performance guarantees. In this work, we propose a Fully First-order Stochastic Approximation (F2SA) method and study its non-asymptotic convergence properties. Specifically, we show that F2SA converges to an $\epsilon$-stationary solution of the bilevel problem after $O(\epsilon^{-7/2})$, $O(\epsilon^{-5/2})$, and $O(\epsilon^{-3/2})$ iterations (each iteration using $O(1)$ samples) when stochastic noise is present in both objectives, only in the upper-level objective, and in neither (the deterministic setting), respectively. We further show that if we employ momentum-assisted gradient estimators, the iteration complexities can be improved to $O(\epsilon^{-5/2})$, $O(\epsilon^{-4/2})$, and $O(\epsilon^{-3/2})$, respectively. We demonstrate the superior practical performance of the proposed method over existing second-order-based approaches on MNIST data-hypercleaning experiments.
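To make the fully first-order idea concrete, below is a minimal deterministic sketch in the spirit of F2SA, written against a toy quadratic bilevel problem. The toy objectives, the step sizes `alpha` and `beta`, and the increasing multiplier schedule `lam` are illustrative assumptions, not the paper's tuned choices; the stochastic variants analyzed in the paper would replace the exact gradients with minibatch estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bilevel problem (an illustrative assumption, not from the paper):
#   upper level: f(x, y) = 0.5*||x||^2 + 0.5*||y - b||^2
#   lower level: g(x, y) = 0.5*||y - A @ x||^2, minimized at y*(x) = A @ x,
# so the hyperobjective F(x) = f(x, y*(x)) has gradient x + A.T @ (A @ x - b).
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

def grad_f_x(x, y): return x
def grad_f_y(x, y): return y - b
def grad_g_x(x, y): return -A.T @ (y - A @ x)
def grad_g_y(x, y): return y - A @ x

x = np.zeros(3)
y = np.zeros(5)  # tracks argmin_y { g(x, y) + f(x, y) / lam }
z = np.zeros(5)  # tracks argmin_y g(x, y)
alpha, beta = 0.1, 0.05  # inner/outer step sizes (assumed, untuned)

for k in range(5000):
    lam = 1.0 + 0.5 * np.sqrt(k)  # slowly increasing multiplier (assumed schedule)
    # Two lower-level first-order steps: no Hessians anywhere.
    y -= alpha * (grad_g_y(x, y) + grad_f_y(x, y) / lam)
    z -= alpha * grad_g_y(x, z)
    # Fully first-order hypergradient estimate used for the upper-level step.
    hypergrad = grad_f_x(x, y) + lam * (grad_g_x(x, y) - grad_g_x(x, z))
    x -= beta * hypergrad

# Sanity check: the stationary point of F solves (I + A.T A) x = A.T b.
x_star = np.linalg.solve(np.eye(3) + A.T @ A, A.T @ b)
print("F2SA-style iterate :", np.round(x, 3))
print("closed-form optimum:", np.round(x_star, 3))
```

As the multiplier grows, both `y` and `z` approach the lower-level solution $y^*(x)$, and the scaled difference $\lambda(\nabla_x g(x, y) - \nabla_x g(x, z))$ recovers the implicit-gradient correction that second-order approaches compute via Hessian-vector products, using only first-order oracles.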