Federated learning allows clients to collaboratively learn statistical models while keeping their data local. Federated learning was originally used to train a single global model served to all clients, but this approach may be sub-optimal when clients' local data distributions are heterogeneous. To tackle this limitation, recent personalized federated learning methods train a separate model for each client while still leveraging the knowledge available at other clients. In this work, we exploit the ability of deep neural networks to extract high-quality vectorial representations (embeddings) from non-tabular data, e.g., images and text, to propose a personalization mechanism based on local memorization. Personalization is obtained by interpolating a pre-trained global model with a $k$-nearest neighbors (kNN) model based on the shared representation provided by the global model. We provide generalization bounds for the proposed approach, and we show on a suite of federated datasets that it achieves significantly higher accuracy and fairness than state-of-the-art methods.
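To make the interpolation step concrete, the following Python sketch shows one plausible way a client could combine the global model's predictive distribution with a label distribution estimated from a local datastore of (embedding, label) pairs. The function name `knn_per_predict`, the Gaussian kernel on neighbor distances, the parameter names (`lam`, `sigma`), and the datastore layout are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def knn_per_predict(x_embed, global_logits, datastore_keys, datastore_labels,
                    k=10, lam=0.5, num_classes=10, sigma=1.0):
    """Sketch of kNN-based personalization: interpolate the global model's
    softmax output with a soft label distribution from the k nearest
    neighbors in the client's local embedding datastore (assumed layout)."""
    # Global model's predictive distribution for input x (numerically stable softmax).
    p_global = np.exp(global_logits - global_logits.max())
    p_global /= p_global.sum()

    # Find the k nearest local examples in the shared embedding space.
    dists = np.linalg.norm(datastore_keys - x_embed, axis=1)
    nn_idx = np.argsort(dists)[:k]

    # Turn neighbor distances into a soft label distribution
    # (Gaussian kernel is an assumption for this sketch).
    weights = np.exp(-dists[nn_idx] / sigma)
    p_knn = np.zeros(num_classes)
    for w, y in zip(weights, datastore_labels[nn_idx]):
        p_knn[y] += w
    p_knn /= p_knn.sum()

    # Personalized prediction: convex combination of the two distributions.
    return lam * p_knn + (1.0 - lam) * p_global
```

The interpolation weight `lam` would be tuned per client; `lam = 0` recovers the global model, while larger values lean more heavily on local memorization.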