In federated learning, clients collaboratively train a shared global model on decentralized local client data. Although federated learning shows significant promise as a key approach when data cannot be shared or centralized, current methods offer limited privacy guarantees and fall short in common real-world scenarios, especially when client data is heterogeneous. In this paper, we propose an alternative method that learns a personalized model for each client in a federated setting, with greater generalization ability than previous methods. To achieve this personalization, we propose a federated learning framework that uses a mixture of experts to combine the specialist nature of a locally trained model with the generalist knowledge of a global model. We evaluate our method on a variety of datasets with different levels of data heterogeneity, and our results show that the mixture-of-experts model is better suited as a personalized model for devices in these settings, outperforming both fine-tuned global models and local specialists.
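The core idea of combining a local specialist with a global generalist via a mixture of experts can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the linear experts, the random stand-in weights, and the sigmoid gate `w_gate` are all hypothetical placeholders for trained models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear "experts": a locally trained specialist and a
# shared global generalist. The weights are random stand-ins, not
# trained parameters.
W_local = rng.normal(size=(4, 2))
W_global = rng.normal(size=(4, 2))
# Hypothetical gating parameters producing a per-example mixing weight.
w_gate = rng.normal(size=4)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def moe_predict(x):
    """Blend the two experts with an input-dependent gate g(x) in (0, 1).

    Output = g(x) * local_expert(x) + (1 - g(x)) * global_expert(x),
    so each client's prediction interpolates between its specialist
    and the shared generalist.
    """
    g = sigmoid(x @ w_gate)  # one gate value per example
    return g[:, None] * (x @ W_local) + (1.0 - g)[:, None] * (x @ W_global)

x = rng.normal(size=(3, 4))
y = moe_predict(x)
print(y.shape)
```

Because the gate is input-dependent, a client whose data resembles the global distribution can lean on the generalist, while a client with highly heterogeneous data can rely more on its local specialist.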