Federated Learning (FL) has emerged as a means of performing distributed learning on local data stored at clients, coordinated by a central server. Recent studies have shown that FL can suffer from poor performance and slower convergence when the training data at clients are not independent and identically distributed (non-IID). Here we consider a new, complementary approach to mitigating this performance degradation by allowing the server to perform auxiliary learning from a small dataset. Our analysis and experiments show that this new approach can achieve significant improvements in both model accuracy and convergence time, even when the server dataset is small and its distribution differs from that of the aggregated data from all clients.
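To make the idea concrete, the following is a minimal sketch of one possible realization: after a standard FedAvg aggregation round, the server takes a few extra gradient steps on its small auxiliary dataset. All function names (`local_update`, `fedavg`, `server_auxiliary_step`) and hyperparameters are illustrative assumptions, not the paper's exact algorithm, and the model is assumed to contain only floating-point parameters.

```python
# Illustrative sketch: one FedAvg round followed by auxiliary server-side
# training on a small server dataset. Names and structure are hypothetical.
import copy
import torch
import torch.nn.functional as F


def local_update(global_model, data, targets, lr=0.1, epochs=1):
    """Client: take a few SGD steps on local (possibly non-IID) data."""
    local = copy.deepcopy(global_model)
    opt = torch.optim.SGD(local.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        F.cross_entropy(local(data), targets).backward()
        opt.step()
    return local.state_dict()


def fedavg(global_model, client_states):
    """Server: average client model parameters (standard FedAvg)."""
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in client_states]).mean(dim=0)
    global_model.load_state_dict(avg)
    return global_model


def server_auxiliary_step(global_model, server_data, server_targets,
                          lr=0.01, steps=5):
    """Server: a few extra SGD steps on its small auxiliary dataset."""
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        F.cross_entropy(global_model(server_data), server_targets).backward()
        opt.step()
    return global_model
```

In this sketch, each communication round would call `local_update` on every participating client, aggregate the returned states with `fedavg`, and then apply `server_auxiliary_step` before broadcasting the updated model; the auxiliary step is what distinguishes this approach from plain FedAvg.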