Federated learning enables multiple parties to collaboratively train a machine learning model without communicating their local data. A key challenge in federated learning is handling the heterogeneity of local data distributions across parties. Although many studies have been proposed to address this challenge, we find that they fail to achieve high performance on image datasets with deep learning models. In this paper, we propose MOON: model-contrastive federated learning. MOON is a simple and effective federated learning framework. The key idea of MOON is to utilize the similarity between model representations to correct the local training of individual parties, i.e., conducting contrastive learning at the model level. Our extensive experiments show that MOON significantly outperforms other state-of-the-art federated learning algorithms on various image classification tasks.
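The model-level contrastive idea can be sketched as follows: during local training, each party's representation of an input is pulled toward the global model's representation (the positive pair) and pushed away from the representation produced by its own previous local model (the negative pair). This is a minimal NumPy sketch under that reading of the abstract; the function names and temperature value are illustrative, not the paper's reference implementation.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two representation vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def model_contrastive_loss(z_local, z_global, z_prev, tau=0.5):
    """Sketch of a model-contrastive loss (names/temperature are illustrative).

    z_local:  representation from the local model being trained
    z_global: representation from the received global model (positive pair)
    z_prev:   representation from the previous local model (negative pair)
    """
    pos = np.exp(cosine_sim(z_local, z_global) / tau)
    neg = np.exp(cosine_sim(z_local, z_prev) / tau)
    # Loss is small when z_local agrees with the global model's
    # representation and large when it drifts back toward z_prev.
    return -np.log(pos / (pos + neg))
```

In local training, a term like this would be added to the usual supervised loss, so that minimizing it keeps each party's learned representation close to the global model's despite heterogeneous local data.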