Applying knowledge distillation to personalized cross-silo federated learning can effectively alleviate the problem of user heterogeneity. This approach, however, requires a proxy dataset, which is difficult to obtain in the real world. Moreover, a global model built by parameter averaging leaks user privacy. We introduce a distributed three-player GAN to implement data-free co-distillation between clients. This technique mitigates the user heterogeneity problem and better protects user privacy. We confirm that the fake samples generated by the GAN make federated distillation more efficient and robust, and that co-distillation achieves strong performance for individual clients while still acquiring global knowledge. Our extensive experiments on benchmark datasets demonstrate the superior generalization performance of the proposed method compared with the state of the art.
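To make the co-distillation idea concrete, the following is a minimal PyTorch sketch of one data-free co-distillation round: a generator supplies fake samples in place of a proxy dataset, and each client distills from the averaged soft predictions of its peers, so neither raw data nor model parameters are exchanged. All names (`generator`, `client_models`, `optimizers`) and hyperparameters are illustrative assumptions, not the paper's exact three-player GAN training procedure.

```python
import torch
import torch.nn.functional as F

def co_distillation_step(generator, client_models, optimizers,
                         z_dim=100, batch_size=64, temperature=3.0):
    """One sketched round of data-free co-distillation on GAN-generated samples.

    Assumed setup: `generator` maps noise to inputs, `client_models` is a list of
    per-client classifiers, `optimizers` is the matching list of optimizers.
    """
    # Sample synthetic inputs from the (frozen) generator instead of a proxy dataset.
    z = torch.randn(batch_size, z_dim)
    with torch.no_grad():
        fake_x = generator(z)
        # Every client's soft predictions on the shared fake batch.
        all_logits = [m(fake_x) for m in client_models]

    for i, (model, opt) in enumerate(zip(client_models, optimizers)):
        # Teacher signal: average of the *other* clients' softened outputs.
        peer_probs = [F.softmax(l / temperature, dim=1)
                      for j, l in enumerate(all_logits) if j != i]
        teacher_probs = torch.stack(peer_probs).mean(dim=0)

        # Student: this client's own softened prediction on the same fake batch.
        student_log_probs = F.log_softmax(model(fake_x) / temperature, dim=1)
        loss = F.kl_div(student_log_probs, teacher_probs,
                        reduction="batchmean") * temperature ** 2

        opt.zero_grad()
        loss.backward()
        opt.step()
```

Because only the generator and soft predictions on fake samples are shared, this sketch avoids both the real proxy dataset and the parameter averaging that the abstract identifies as privacy risks; the paper's full method additionally trains the generator as the third player of the GAN.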