Federated learning has advanced rapidly over the last few years but still faces many challenges, such as how to save communication resources, how to reduce computational costs, and whether the algorithms converge. To address these issues, this paper proposes a new federated learning algorithm (FedGiA) that combines gradient descent with the inexact alternating direction method of multipliers (ADMM). It is shown that FedGiA is computation- and communication-efficient and converges linearly under mild conditions.
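The abstract describes FedGiA only at a high level. As an illustration of the general idea of combining gradient descent with inexact ADMM in a federated setting, the following is a minimal sketch of one communication round of a consensus-ADMM scheme in which each client solves its local subproblem inexactly with a few gradient steps. The function name, the hyperparameters `rho`, `lr`, and `local_steps`, and the toy losses are illustrative assumptions, not the exact update rules or parameters of the paper.

```python
import numpy as np

# Sketch (not the paper's exact updates): one round of consensus ADMM where
# each client runs a few gradient descent steps on its local augmented
# Lagrangian instead of solving the subproblem exactly.
def inexact_admm_round(x, z, lam, grad_f, rho=1.0, lr=0.1, local_steps=2):
    m = len(x)
    # Client side: gradient steps on
    #   f_i(x_i) + <lam_i, x_i - z> + (rho/2) * ||x_i - z||^2
    for i in range(m):
        for _ in range(local_steps):
            g = grad_f[i](x[i]) + lam[i] + rho * (x[i] - z)
            x[i] = x[i] - lr * g
    # Server side: aggregate the shifted client models, then dual ascent.
    z = np.mean([x[i] + lam[i] / rho for i in range(m)], axis=0)
    for i in range(m):
        lam[i] = lam[i] + rho * (x[i] - z)
    return x, z, lam

# Toy usage with quadratic local losses f_i(x) = 0.5 * ||x - a_i||^2,
# whose consensus minimizer is the mean of the a_i.
if __name__ == "__main__":
    targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([2.0, 2.0])]
    grad_f = [lambda v, a=a: v - a for a in targets]
    x = [np.zeros(2) for _ in targets]
    lam = [np.zeros(2) for _ in targets]
    z = np.zeros(2)
    for _ in range(50):
        x, z, lam = inexact_admm_round(x, z, lam, grad_f)
    print(z)  # should approach the mean of the targets, i.e. [1., 1.]
```

Running only a few local gradient steps per round is what keeps the per-round computation cheap, while exchanging a single model (and dual) update per client per round keeps communication low; these are the two costs the abstract claims FedGiA controls.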