Federated learning (FL) helps protect data privacy by training a shared model in a decentralized manner on clients' physical devices. When local data distributions are highly heterogeneous, personalized FL strategies seek to mitigate the resulting client drift. In this paper, we present a group personalization approach for FL applications in which clients fall into inherent, significantly distinct partitions. In our method, the global FL model is first fine-tuned through a second FL training process over each homogeneous group of clients, and each group-specific FL model is then further adapted and personalized for any individual client. The proposed method admits a natural interpretation from a Bayesian hierarchical modeling perspective. Through experiments on two real-world datasets, we demonstrate that this approach achieves superior personalization performance compared with other FL counterparts.
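To make the three-stage pipeline concrete, the following is a minimal sketch, not the authors' implementation: it assumes toy linear-regression clients, a known group partition, and plain FedAvg-style uniform weight averaging for both the global and the group-level FL stages, followed by local fine-tuning as the per-client personalization step. All names (`make_client`, `local_sgd`, `fedavg`) and the data-generating setup are illustrative assumptions.

```python
# Sketch of the group-personalized FL pipeline described above:
# (1) global FL model over all clients, (2) group-specific FL fine-tuning
# starting from the global model, (3) per-client personalization from the
# client's group model. Toy data and linear models are assumptions.
import numpy as np

rng = np.random.default_rng(0)
DIM = 5

def make_client(group_shift):
    """Toy client: linear-regression data whose true weights depend on the group."""
    w_true = np.ones(DIM) + group_shift
    X = rng.normal(size=(50, DIM))
    y = X @ w_true + 0.1 * rng.normal(size=50)
    return X, y

def local_sgd(w, X, y, lr=0.01, epochs=5):
    """A few epochs of full-batch gradient descent on squared loss."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg(w_init, clients, rounds=20):
    """FedAvg-style training: broadcast, local update, uniform averaging."""
    w = w_init.copy()
    for _ in range(rounds):
        w = np.mean([local_sgd(w, X, y) for X, y in clients], axis=0)
    return w

# Two inherently distinct groups of clients (the partition is assumed known here).
groups = {g: [make_client(shift) for _ in range(4)]
          for g, shift in [("A", 0.0), ("B", 2.0)]}
all_clients = [c for cs in groups.values() for c in cs]

# Stage 1: global FL model trained over all clients.
w_global = fedavg(np.zeros(DIM), all_clients)

# Stage 2: group-specific FL models, fine-tuned from the global model.
w_group = {g: fedavg(w_global, cs) for g, cs in groups.items()}

# Stage 3: per-client personalization starting from the client's group model.
personalized = {(g, i): local_sgd(w_group[g], X, y, epochs=20)
                for g, cs in groups.items() for i, (X, y) in enumerate(cs)}
```

The sketch mirrors the hierarchical structure of the method: the global model acts as a shared prior, each group model refines it over a homogeneous subpopulation, and the final local step personalizes the group model to an individual client.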