Federated Learning (FL) has become an active and promising distributed machine learning paradigm. Recent studies clearly show that, under statistical heterogeneity, the performance of popular FL methods (e.g., FedAvg) deteriorates dramatically due to the client drift caused by local updates. This paper proposes a novel Federated Learning algorithm (called IGFL), which leverages both Individual and Group behaviors to mimic the distribution, thereby improving its ability to deal with heterogeneity. Unlike existing FL methods, our IGFL can be applied to both client and server optimization. As a by-product, we propose a new attention-based federated learning approach for the server optimization of IGFL. To the best of our knowledge, this is the first work to incorporate attention mechanisms into federated optimization. We conduct extensive experiments and show that IGFL can significantly improve the performance of existing federated learning methods. In particular, when the data distributions among individuals are diverse, IGFL improves classification accuracy by about 13% over prior baselines.
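The abstract does not specify how the attention mechanism enters the server-side aggregation, so the following is only a minimal sketch of one plausible attention-based aggregation rule. The function `attention_aggregate`, the cosine-similarity scoring against the mean update, and the `temperature` parameter are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def attention_aggregate(client_updates, temperature=1.0):
    """Aggregate flattened client updates with attention weights.

    Hypothetical sketch: each client's weight comes from a softmax over
    the cosine similarity between its individual update and the mean
    update, so clients whose behavior aligns with the group behavior
    contribute more to the new global update.
    """
    updates = np.stack(client_updates)   # shape: (num_clients, dim)
    mean_update = updates.mean(axis=0)   # stand-in for the "group behavior"

    # Cosine similarity between each individual update and the group mean.
    norms = np.linalg.norm(updates, axis=1) * np.linalg.norm(mean_update) + 1e-12
    scores = (updates @ mean_update) / norms

    # Softmax turns similarity scores into attention weights.
    weights = np.exp(scores / temperature)
    weights /= weights.sum()

    # Attention-weighted average replaces FedAvg's sample-size weighting.
    return weights @ updates

# Example: aggregate three clients' flattened parameter updates.
updates = [np.random.randn(10) for _ in range(3)]
global_delta = attention_aggregate(updates)
```

Under this reading, the attention weights down-weight clients whose local updates drift far from the group direction, which is one way heterogeneity-induced client drift could be mitigated on the server side.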