Privacy, security, and bandwidth constraints have motivated federated learning (FL) in wireless systems, where a machine learning (ML) model is trained collaboratively without sharing raw data. Such collaborative FL strategies often require model aggregation at a central server. Decentralized FL, on the other hand, requires the participating clients to reach consensus on an ML model by exchanging parameter updates. In this work, we propose the over-the-air clustered wireless FL (CWFL) strategy, which eliminates the need for a powerful central server yet achieves accuracy comparable to the server-based strategy while requiring fewer channel uses than decentralized FL. We theoretically show that CWFL converges at a rate of O(1/T) per cluster while mitigating the impact of noise. Using the MNIST and CIFAR datasets, we demonstrate the accuracy of CWFL for different numbers of clusters across communication rounds.