Personalized federated learning aims to allow numerous clients to train personalized models while participating in collaborative training in a communication-efficient manner, without exchanging private data. However, many personalized federated learning algorithms assume that clients share the same neural network architecture, and algorithms for heterogeneous models remain understudied. In this study, we propose a novel personalized federated learning method called federated classifier averaging (FedClassAvg). Deep neural networks for supervised learning tasks consist of feature extractor and classifier layers. FedClassAvg aggregates classifier weights as an agreement on the decision boundaries over feature spaces, so that clients with non-independently and identically distributed (non-iid) data can learn about scarce labels. In addition, local feature representation learning is applied to stabilize the decision boundaries and improve the local feature extraction capabilities of clients. Whereas existing methods require collecting auxiliary data or model weights to generate a counterpart, FedClassAvg requires clients to communicate only a few fully connected layers, which is highly communication-efficient. Moreover, FedClassAvg does not require solving additional optimization problems such as knowledge transfer, which incurs intensive computational overhead. We evaluated FedClassAvg through extensive experiments and demonstrated that it outperforms the current state-of-the-art algorithms on heterogeneous personalized federated learning tasks.
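The core aggregation step described above, averaging only the classifier (fully connected) weights across clients with otherwise heterogeneous feature extractors, can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the parameter names, shapes, and uniform weighting are assumptions made for the example.

```python
import numpy as np

def classifier_avg(client_classifiers, weights=None):
    """Average only the classifier parameters across clients.

    client_classifiers: list of dicts mapping parameter name -> np.ndarray.
    All clients share the same classifier shape, even though their feature
    extractor architectures may differ.
    """
    n = len(client_classifiers)
    if weights is None:
        weights = [1.0 / n] * n  # uniform weighting, assumed for illustration
    avg = {}
    for name in client_classifiers[0]:
        avg[name] = sum(w * c[name] for w, c in zip(weights, client_classifiers))
    return avg

# Example round: three clients with heterogeneous backbones but a shared
# 4-class linear classifier head (weight matrix + bias).
clients = [
    {"fc.weight": np.random.randn(4, 16), "fc.bias": np.random.randn(4)}
    for _ in range(3)
]
global_classifier = classifier_avg(clients)
# Each client would then load global_classifier into its own model and
# continue local training (including local feature representation learning).
```

Note that only the classifier dict is communicated, which is what makes the scheme communication-efficient relative to exchanging full model weights.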