Bilevel optimization has arisen as a powerful tool for many machine learning problems such as meta-learning, hyperparameter optimization, and reinforcement learning. In this paper, we investigate the nonconvex-strongly-convex bilevel optimization problem. For deterministic bilevel optimization, we provide a comprehensive finite-time convergence analysis for two popular algorithms based respectively on approximate implicit differentiation (AID) and iterative differentiation (ITD). For the AID-based method, we improve the order of the previous finite-time convergence analysis via a more practical parameter selection and a warm-start strategy, and for the ITD-based method we establish the first theoretical convergence rate. Our analysis also provides a quantitative comparison between ITD- and AID-based approaches. For stochastic bilevel optimization, we propose a novel algorithm named stocBiO, which features a sample-efficient hypergradient estimator built on efficient Jacobian- and Hessian-vector product computations. We provide a finite-time convergence guarantee for stocBiO, and show that stocBiO improves on the best known computational complexities order-wise with respect to the condition number $\kappa$ and the target accuracy $\epsilon$. We further validate our theoretical results and demonstrate the efficiency of bilevel optimization algorithms through experiments on meta-learning and hyperparameter optimization. Our code for stocBiO and its comparison to other algorithms is available online$^1$.
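To make the hypergradient machinery concrete: for the bilevel problem $\min_x \Phi(x) = f(x, y^*(x))$ with $y^*(x) = \arg\min_y g(x, y)$ and $g$ strongly convex in $y$, the implicit function theorem gives $\nabla \Phi(x) = \nabla_x f - \nabla_{xy} g\, [\nabla_{yy} g]^{-1} \nabla_y f$, and the inverse-Hessian-vector product can be approximated with a Neumann series that needs only Hessian-vector products. The following is a minimal NumPy sketch on a quadratic lower-level problem where the exact answer is known; the test problem and all names (`hvp`, `neumann_inverse_hvp`, `eta`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5

# Lower-level objective g(x, y) = 0.5 * y^T A y - x^T y, strongly convex in y.
M = rng.standard_normal((d, d))
A = M @ M.T + d * np.eye(d)          # symmetric positive definite Hessian
x = rng.standard_normal(d)
y_star = np.linalg.solve(A, x)       # exact lower-level minimizer y*(x)

# Upper-level objective f(x, y) = 0.5 * ||y||^2, so grad_y f = y*(x).
grad_y_f = y_star

def hvp(v):
    """Hessian-vector product with the lower-level Hessian (A @ v here;
    in practice this would be an automatic-differentiation HVP)."""
    return A @ v

def neumann_inverse_hvp(b, eta, K):
    """Approximate A^{-1} b by eta * sum_{k=0}^{K} (I - eta*A)^k b,
    using only Hessian-vector products (valid for eta < 1/lambda_max(A))."""
    v = b.copy()
    term = b.copy()
    for _ in range(K):
        term = term - eta * hvp(term)
        v = v + term
    return eta * v

# For this test problem grad_x f = 0 and grad_{xy} g = -I, so the
# hypergradient reduces to v = [grad_{yy} g]^{-1} grad_y f = A^{-1} grad_y f.
eta = 1.0 / np.linalg.norm(A, 2)
v_approx = neumann_inverse_hvp(grad_y_f, eta, K=500)
v_exact = np.linalg.solve(A, grad_y_f)
print(np.max(np.abs(v_approx - v_exact)))  # small approximation error
```

The Neumann-series approach avoids forming or inverting the Hessian explicitly, which is what makes Hessian-vector-product-based estimators sample- and memory-efficient in the stochastic setting.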