Federated learning has advanced considerably over the last few years but still faces many challenges, such as how algorithms can save communication resources, reduce computational costs, and guarantee convergence. To address these issues, this paper proposes exact and inexact ADMM-based federated learning. Both variants are not only communication-efficient but also converge linearly under very mild conditions: no convexity is required, and the guarantees hold regardless of how the data are distributed across clients. Moreover, the inexact version has low computational complexity, thereby alleviating the computational burden significantly.
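To make the exact/inexact distinction concrete, here is a minimal sketch of one communication round of consensus-form ADMM for federated learning, written under assumptions made purely for illustration (it is not the paper's implementation or notation): `m` clients each hold a local loss `f_i`, the global problem is `min_z (1/m) * sum_i f_i(z)` rewritten as `min sum_i f_i(w_i)` subject to `w_i = z`, and the exact variant solves each local augmented-Lagrangian subproblem to near-optimality while the inexact variant replaces that with a few gradient steps. All names (`grad_f`, `rho`, `lr`, `local_steps`) are hypothetical.

```python
import numpy as np

def admm_round(w, y, z, grad_f, rho, inexact=False, local_steps=5, lr=0.1):
    """One ADMM round (illustrative sketch, not the paper's algorithm).

    w, y : (m, d) arrays of local parameters and dual variables
    z    : (d,) global model held by the server
    grad_f : list of m callables, each returning the local loss gradient
    """
    m, d = w.shape
    for i in range(m):
        if inexact:
            # Inexact local update: a few gradient steps on the local
            # augmented Lagrangian, cutting per-round computation.
            for _ in range(local_steps):
                g = grad_f[i](w[i]) + y[i] + rho * (w[i] - z)
                w[i] -= lr * g
        else:
            # Exact local update: run the subproblem solver until the
            # augmented-Lagrangian gradient is (numerically) zero.
            for _ in range(1000):
                g = grad_f[i](w[i]) + y[i] + rho * (w[i] - z)
                if np.linalg.norm(g) < 1e-8:
                    break
                w[i] -= lr * g
    # Server aggregation: the consensus z-update has this closed form.
    z_new = (w + y / rho).mean(axis=0)
    # Dual ascent step on each client's multiplier.
    y += rho * (w - z_new)
    return w, y, z_new
```

In this sketch the communication pattern is the same for both variants (each round exchanges only `w_i + y_i / rho` and `z`); the inexact version simply swaps the expensive inner solve for a fixed, small number of gradient steps, which is where its computational savings come from.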