We study the federated optimization problem from a dual perspective and propose a new algorithm, termed federated dual coordinate descent (FedDCD), which builds on a type of coordinate descent method developed by Necoara et al. [Journal of Optimization Theory and Applications, 2017]. Additionally, we enhance the FedDCD method with inexact gradient oracles and Nesterov's acceleration. We show theoretically that our proposed approach achieves better convergence rates than state-of-the-art primal federated optimization algorithms in certain regimes. Numerical experiments on real-world datasets support our analysis.
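To make the dual viewpoint concrete, here is a minimal sketch of one way a federated dual coordinate descent round can look, using ridge regression as the toy problem. Every name and update rule below is an illustrative assumption (SDCA-style closed-form coordinate updates with two hypothetical clients), not the paper's exact FedDCD algorithm or its inexact-oracle/accelerated variants.

```python
import numpy as np

# Toy sketch (assumption, not the paper's FedDCD): clients own disjoint
# blocks of dual variables and update them; the shared primal iterate is
# recovered from the dual as w = (1/(lam*n)) * sum_i alpha_i * a_i.
# Primal problem: min_w (1/(2n)) * ||A w - b||^2 + (lam/2) * ||w||^2.

rng = np.random.default_rng(0)
n, d, lam = 40, 5, 0.1
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# Partition the examples (and hence dual coordinates) across two clients.
clients = [np.arange(0, 20), np.arange(20, 40)]
alpha = np.zeros(n)
w = np.zeros(d)

for _ in range(200):  # communication rounds
    for idx in clients:
        # Each client makes one pass over its own dual block.
        for i in idx:
            # Closed-form coordinate maximizer of the dual for squared loss:
            # delta = (b_i - a_i.w - alpha_i) / (1 + ||a_i||^2 / (lam*n)).
            resid = A[i] @ w - b[i] + alpha[i]
            delta = -resid / (1.0 + (A[i] @ A[i]) / (lam * n))
            alpha[i] += delta
            w += delta * A[i] / (lam * n)  # keep primal consistent with dual

# Compare against the closed-form ridge solution.
w_star = np.linalg.solve(A.T @ A / n + lam * np.eye(d), A.T @ b / n)
print(np.allclose(w, w_star, atol=1e-3))
```

For readability the sketch sweeps both clients sequentially in every round; a federated method would instead sample participating clients and aggregate their block updates at a server, which is where the choice of coordinate descent scheme and acceleration matters.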