Non-IID data present a tough challenge for federated learning. In this paper, we explore the novel idea of facilitating pairwise collaboration between clients with similar data. We propose FedAMP, a new method that employs federated attentive message passing to encourage similar clients to collaborate more strongly. We establish the convergence of FedAMP for both convex and non-convex models, and propose a heuristic method that further improves the performance of FedAMP when clients adopt deep neural networks as personalized models. Extensive experiments on benchmark datasets demonstrate the superior performance of the proposed methods.
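For intuition, the sketch below illustrates the attention-style, similarity-weighted aggregation that the abstract refers to as attentive message passing: each client receives a personalized aggregate in which clients with more similar models contribute larger weights. This is a minimal illustration only; the Gaussian kernel, the `self_weight` handling, and the helper name `attentive_aggregation` are assumptions for exposition, not the paper's exact update rule.

```python
import numpy as np

def attentive_aggregation(client_weights, sigma=1.0, self_weight=0.5):
    """Similarity-weighted message passing between client models (illustrative).

    Each client i gets a personalized aggregate U[i] that mixes the other
    clients' parameters, giving larger mixing weights to clients whose
    models are closer to its own.
    """
    W = np.stack(client_weights)                      # (num_clients, dim)
    n = W.shape[0]
    # Pairwise squared distances between flattened client models.
    d2 = ((W[:, None, :] - W[None, :, :]) ** 2).sum(-1)
    # Attention-style weights: closer models get larger weights (a Gaussian
    # kernel is used here as a stand-in for the paper's attention function).
    xi = np.exp(-d2 / sigma)
    np.fill_diagonal(xi, 0.0)
    # Cross-client weights sum to (1 - self_weight) per row; each client also
    # keeps a fixed share of its own model.
    xi = (1.0 - self_weight) * xi / xi.sum(axis=1, keepdims=True)
    xi = xi + self_weight * np.eye(n)
    return xi @ W                                     # personalized aggregates

# Example: three clients; the first two have similar models, so their
# aggregates borrow mostly from each other rather than from the third.
clients = [np.array([0.0, 1.0]), np.array([0.1, 0.9]), np.array([5.0, -3.0])]
personalized = attentive_aggregation(clients)
```

In this sketch, similar clients exchange most of their "messages" with each other, which is the pairwise-collaboration effect the abstract describes; in the actual method, each client then trains its personalized model against its own aggregate rather than against a single shared global model.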