Bilevel optimization has witnessed notable progress recently with newly emerging efficient algorithms, yet it remains underexplored in the Federated Learning setting. It is unclear how the challenges of Federated Learning affect the convergence of bilevel algorithms. In this work, we study Federated Bilevel Optimization problems. We first propose the FedBiO algorithm, which solves the hyper-gradient estimation problem efficiently; we then propose FedBiOAcc to accelerate FedBiO. FedBiO has communication complexity $O(\epsilon^{-1.5})$ with linear speedup, while FedBiOAcc achieves communication complexity $O(\epsilon^{-1})$, sample complexity $O(\epsilon^{-1.5})$, and linear speedup as well. We also study Federated Bilevel Optimization problems with local lower-level problems, and prove that FedBiO and FedBiOAcc converge at the same rates with minor modifications.
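To make the hyper-gradient estimation problem concrete, the following is a minimal sketch (not the FedBiO estimator itself) of the implicit-function-theorem hyper-gradient for a bilevel problem $\min_x f(x, y^*(x))$ s.t. $y^*(x) = \arg\min_y g(x, y)$. The quadratic objectives, matrix `A`, vector `b`, and regularizer `lam` are illustrative assumptions chosen so the lower-level solution and the true gradient have closed forms we can check against:

```python
import numpy as np

rng = np.random.default_rng(0)
d, p = 4, 3
A = rng.standard_normal((d, p))  # couples x (dim p) to y (dim d); assumed data
b = rng.standard_normal(d)
lam = 0.1
x = rng.standard_normal(p)

# Lower level: g(x, y) = 0.5 * ||y - A x||^2, so y*(x) = A x in closed form.
y_star = A @ x

# Upper level: f(x, y) = 0.5 * ||y - b||^2 + 0.5 * lam * ||x||^2.
# Hyper-gradient via the implicit function theorem:
#   dF/dx = grad_x f - (grad_xy g) [grad_yy g]^{-1} grad_y f
grad_x_f = lam * x
grad_y_f = y_star - b
grad_xy_g = -A.T           # cross second derivative of g, shape (p, d)
grad_yy_g = np.eye(d)      # Hessian of g in y
hypergrad = grad_x_f - grad_xy_g @ np.linalg.solve(grad_yy_g, grad_y_f)

# Sanity check against the direct gradient of the reduced objective
# F(x) = 0.5 * ||A x - b||^2 + 0.5 * lam * ||x||^2.
direct = A.T @ (A @ x - b) + lam * x
assert np.allclose(hypergrad, direct)
```

In the federated setting, the expensive pieces of this formula (the Hessian inverse and the cross derivative) depend on data distributed across clients, which is the estimation problem the abstract refers to.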