We consider the problem of personalized federated learning when there are known cluster structures among users. An intuitive approach is to regularize the parameters so that users in the same cluster share similar model weights; the distances between clusters can then be regularized to reflect the similarity between different clusters of users. We develop an algorithm that allows each cluster to communicate independently, and we derive convergence results for it. We analyze a hierarchical linear model to show theoretically that our approach outperforms both agents learning independently and agents learning a single shared set of weights. Finally, we demonstrate the advantages of our approach on both simulated and real-world data.
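As a rough illustration of this idea (a sketch with assumed notation, not necessarily the paper's exact formulation), the regularized objective might take a form such as

$$
\min_{\{w_i\},\,\{\theta_c\}} \; \sum_{i=1}^{N} f_i(w_i) \;+\; \lambda_1 \sum_{i=1}^{N} \big\| w_i - \theta_{c(i)} \big\|^2 \;+\; \lambda_2 \sum_{c \neq c'} A_{c,c'} \big\| \theta_c - \theta_{c'} \big\|^2 ,
$$

where $f_i$ is user $i$'s local loss, $w_i$ its personalized weights, $\theta_{c(i)}$ the center of its known cluster, $A_{c,c'}$ an assumed similarity between clusters $c$ and $c'$, and $\lambda_1, \lambda_2$ hypothetical penalty coefficients. The first penalty pulls each user's weights toward its cluster center, and the second regularizes the distances between cluster centers in proportion to how similar the clusters are.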