Client selection schemes are widely adopted in recent studies of Federated Learning (FL) to address the communication efficiency problem. However, the large variance of the model updates aggregated from randomly selected, unrepresentative client subsets directly slows FL convergence. We present a novel clustering-based client selection scheme that accelerates FL convergence through variance reduction. Simple yet effective schemes are designed to improve the clustering quality and control its fluctuation, thereby generating client subsets with a degree of sampling representativeness. Theoretically, we demonstrate the variance reduction achieved by the proposed scheme, and owing to this reduction we establish a tighter convergence guarantee for the proposed method. Experimental results confirm the superior efficiency of our scheme compared to alternatives.
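To make the clustering-based selection idea concrete, the sketch below groups clients by their most recent flattened update vectors with k-means and samples one representative per cluster. This is a minimal illustration under stated assumptions, not the paper's exact algorithm: the function name `select_clients`, the use of scikit-learn's `KMeans`, and uniform within-cluster sampling are all illustrative choices.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_clients(client_updates, num_selected, seed=0):
    """Cluster clients by their latest flattened model updates and
    sample one representative per cluster, so the selected subset
    covers the diversity of updates (variance reduction).

    client_updates: dict mapping client id -> 1-D np.ndarray (flattened update)
    num_selected:   number of clients to select (= number of clusters)
    """
    rng = np.random.default_rng(seed)
    ids = list(client_updates.keys())
    X = np.stack([client_updates[c] for c in ids])

    # Partition clients into num_selected groups of similar updates.
    km = KMeans(n_clusters=num_selected, n_init=10, random_state=seed).fit(X)

    selected = []
    for k in range(num_selected):
        members = [ids[i] for i in range(len(ids)) if km.labels_[i] == k]
        # Pick one client uniformly from each cluster; reweighting by
        # cluster size at aggregation time would keep the global
        # update estimate unbiased.
        selected.append(rng.choice(members))
    return selected
```

Compared with uniform random sampling, drawing one client per cluster ensures every region of the update space is represented in each round, which is the mechanism by which the aggregated update's variance is reduced.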