Federated learning is a distributed learning paradigm that allows each client to keep its raw data locally and upload only the parameters of its local model to the server. Although federated learning can address the data-island problem, training on heterogeneous data remains challenging in real applications. In this paper, we propose FedSiam-DA, a novel dual-aggregated contrastive federated learning approach that personalizes both the local and global models under various settings of data heterogeneity. First, building on the idea of contrastive learning in the Siamese network, FedSiam-DA treats the local and global models as different branches of a Siamese network during local training and controls the update direction of the local model by continually adjusting model similarity, thereby personalizing the local model. Second, FedSiam-DA introduces dynamic weights based on model similarity for each local model and applies a dual-aggregation mechanism to further improve the generalization of the global model. Finally, extensive experiments on benchmark datasets demonstrate that FedSiam-DA outperforms several previous federated learning approaches on heterogeneous data.
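To make the two mechanisms above concrete, the following PyTorch sketch illustrates one plausible reading of them: a model-contrastive local loss that pulls the local representation toward the global branch and away from the previous-round local branch, and a server-side aggregation that weights clients by cosine similarity to the current global model. This is a minimal illustration, not the paper's specification; the function names, the temperature parameter, and the softmax weighting rule are our assumptions.

```python
import copy
import torch
import torch.nn.functional as F


def contrastive_local_loss(z_local, z_global, z_prev, temperature=0.5):
    """Model-contrastive term during local training (assumed form).

    The frozen global model acts as the positive Siamese branch and the
    previous-round local model as the negative branch, so gradients steer
    the local model toward the global representation.
    """
    pos = F.cosine_similarity(z_local, z_global.detach(), dim=-1)  # (B,)
    neg = F.cosine_similarity(z_local, z_prev.detach(), dim=-1)    # (B,)
    logits = torch.stack([pos, neg], dim=1) / temperature          # (B, 2)
    labels = torch.zeros(z_local.size(0), dtype=torch.long,
                         device=z_local.device)                    # positive = index 0
    return F.cross_entropy(logits, labels)


def flatten_params(model):
    """Concatenate all parameters into one vector for similarity comparison."""
    return torch.cat([p.detach().reshape(-1) for p in model.parameters()])


def similarity_weighted_aggregate(global_model, client_models):
    """Dual-aggregation step with dynamic weights (hypothetical rule):
    clients whose models align more closely with the current global model
    receive larger aggregation weights via a softmax over cosine similarities.
    """
    g = flatten_params(global_model)
    sims = torch.stack([F.cosine_similarity(flatten_params(m), g, dim=0)
                        for m in client_models])
    weights = torch.softmax(sims, dim=0)

    new_state = copy.deepcopy(global_model.state_dict())
    for key, val in new_state.items():
        if val.is_floating_point():  # skip integer buffers, e.g. BatchNorm counters
            new_state[key] = sum(w * m.state_dict()[key]
                                 for w, m in zip(weights, client_models))
    global_model.load_state_dict(new_state)
    return global_model
```

In this reading, the contrastive term would be added to the usual supervised loss during each client's local epochs, while the similarity-based weighting replaces plain sample-count averaging at the server; both choices are inferred from the abstract rather than taken from the paper's equations.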