Due to the curse of statistical heterogeneity across clients, adopting a personalized federated learning method has become an essential choice for the successful deployment of federated learning-based services. Among the diverse branches of personalization techniques, model mixture-based personalization is preferred because each client ends up with its own personalized model as a result of federated learning. It usually requires both a local model and a federated model, but existing approaches are either limited to partial parameter exchange or require additional local updates; the former cannot serve novel clients, and the latter burdens clients' limited computational capacity. Motivated by the discovery that a connected subspace of diverse low-loss solutions exists between two or more independently trained deep networks, we combine this property with model mixture-based personalized federated learning to improve personalization performance. We propose SuPerFed, a personalized federated learning method that induces an explicit connection in weight space between the optima of the local and the federated model so that each boosts the other. Through extensive experiments on several benchmark datasets, we demonstrate that our method achieves consistent gains in both personalization performance and robustness to problematic scenarios that can arise in realistic services.
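The core idea of mixing a local and a federated model along a connected low-loss subspace can be illustrated with a toy sketch. The snippet below is an assumption-based illustration, not the paper's implementation: `mix_weights`, the toy quadratic loss, and the two weight vectors are all hypothetical, and real methods of this kind typically sample a mixing coefficient per update so that every convex combination of the two models stays low-loss.

```python
import numpy as np

def mix_weights(w_local, w_fed, lam):
    """Convex combination of local and federated weights (hypothetical helper)."""
    return lam * w_local + (1.0 - lam) * w_fed

def toy_loss(w):
    """Illustrative quadratic loss with a flat region of good solutions near 1.0."""
    return float(np.sum((w - 1.0) ** 2))

# Assumed, purely illustrative optima for the two models.
w_local = np.full(4, 0.9)
w_fed = np.full(4, 1.1)

# Sampling lambda uniformly evaluates (and, in training, would optimize)
# every point on the segment between the two models, encouraging a
# connected low-loss subspace rather than two isolated optima.
rng = np.random.default_rng(0)
for lam in rng.uniform(0.0, 1.0, size=5):
    w_mix = mix_weights(w_local, w_fed, lam)
    print(f"lambda={lam:.2f}, loss={toy_loss(w_mix):.4f}")
```

In an actual federated setting, the federated weights would be the server aggregate, the local weights would stay on the client, and the loss would be the client's task loss evaluated on the mixed model.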