Federated learning is a distributed learning paradigm that allows each client to keep its raw data locally and upload only the parameters of its local model to the server. Although federated learning can address the data island problem, training over heterogeneous data remains challenging in real applications. In this paper, we propose FedSiam-DA, a novel dual-aggregated contrastive federated learning approach that personalizes both the local and global models under various settings of data heterogeneity. First, drawing on the idea of contrastive learning in siamese networks, FedSiam-DA treats the local and global models as the two branches of a siamese network during local training and controls the update direction of the local model by continually adjusting the similarity between the models, thereby personalizing the local model. Second, FedSiam-DA introduces a dynamic weight for each local model based on its similarity to the global model and applies a dual-aggregation mechanism to further improve the generalization of the global model. Moreover, extensive experiments on benchmark datasets demonstrate that FedSiam-DA outperforms several previous federated learning approaches on heterogeneous data.
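As a rough illustration of the two mechanisms sketched above, the following PyTorch-style snippet shows (a) a SimSiam-style local objective that treats the local and global models as two siamese branches and pulls the local representation toward a stop-gradient copy of the global one, and (b) a server-side aggregation that weights each client by its similarity to the global model. All names (`negative_cosine`, `local_contrastive_loss`, `dual_aggregate`), the exact loss form, the choice of cosine similarity on model outputs, and the softmax-normalized weights are assumptions for illustration, not the paper's precise formulation.

```python
# Minimal sketch of contrastive local training and similarity-weighted
# dual aggregation; details are illustrative assumptions, not FedSiam-DA's
# exact algorithm.
import copy
import torch
import torch.nn.functional as F


def negative_cosine(p: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
    """SimSiam-style negative cosine similarity; z is detached (stop-gradient)."""
    return -F.cosine_similarity(p, z.detach(), dim=-1).mean()


def local_contrastive_loss(local_model, global_model, x, y, mu=1.0):
    """Client side: fit the local data while pulling the local model's output
    toward the frozen global model's output (the two siamese branches)."""
    logits_local = local_model(x)
    with torch.no_grad():
        logits_global = global_model(x)
    task_loss = F.cross_entropy(logits_local, y)
    sim_loss = negative_cosine(logits_local, logits_global)  # assumed form
    return task_loss + mu * sim_loss


def dual_aggregate(global_model, client_states, client_sims):
    """Server side: instead of plain averaging, weight each client's state
    dict by a (softmax-normalized) similarity score to the global model."""
    weights = torch.softmax(torch.tensor(client_sims), dim=0)
    new_state = copy.deepcopy(client_states[0])
    for key in new_state:
        new_state[key] = sum(w * s[key].float()
                             for w, s in zip(weights, client_states))
    global_model.load_state_dict(new_state)
    return global_model
```

The stop-gradient on the global branch mirrors the siamese-network setup the abstract references: the global model acts as a fixed target during local training, so only the local branch is updated toward it.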