Non-independent and identically distributed (non-IID) data among clients is considered a key factor that degrades the performance of federated learning (FL). Several approaches to handling non-IID data, such as personalized FL and federated multi-task learning (FMTL), are of great interest to the research community. In this work, we first formulate the FMTL problem using Laplacian regularization to explicitly leverage the relationships among clients' models for multi-task learning. We then introduce a new view of the FMTL problem, which shows for the first time that the formulated FMTL problem can be used for both conventional FL and personalized FL. We also propose two algorithms, FedU and dFedU, to solve the formulated FMTL problem in communication-centralized and decentralized schemes, respectively. Theoretically, we prove that the convergence rates of both algorithms achieve linear speedup for strongly convex objectives and sublinear speedup of order 1/2 for nonconvex objectives. Experimentally, we show that our algorithms outperform the conventional algorithm FedAvg in FL settings, MOCHA in FMTL settings, and pFedMe and Per-FedAvg in personalized FL settings.
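To make the formulation concrete, a Laplacian-regularized FMTL objective of the kind described above typically takes the following form; this is a sketch in standard notation ($F_k$ for client $k$'s local loss, $a_{kl}$ for the similarity weight between clients $k$ and $l$, $\eta$ for the regularization strength), which may differ from the paper's exact symbols:

$$ \min_{\mathbf{w}_1,\dots,\mathbf{w}_N} \; \sum_{k=1}^{N} F_k(\mathbf{w}_k) \;+\; \frac{\eta}{2} \sum_{k=1}^{N} \sum_{l=1}^{N} a_{kl} \,\lVert \mathbf{w}_k - \mathbf{w}_l \rVert^2 $$

The second term penalizes disagreement between the models of related clients: $\eta = 0$ decouples the clients into independent local training, while a large $\eta$ over a fully connected graph pushes all models toward a single shared model, which is how one formulation can cover personalized FL and conventional FL as special cases.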
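For illustration, below is a minimal sketch of the kind of Laplacian-regularization step such an algorithm could perform on the server side after clients finish their local updates. The function name, update schedule, and hyperparameters are illustrative assumptions, not the paper's exact FedU procedure:

```python
import numpy as np

def laplacian_step(models, adjacency, eta, lam):
    """One Laplacian-regularization step: pull each client's model
    toward its neighbors' models, weighted by the similarity a_kl.

    models:    (N, d) array of per-client parameter vectors
    adjacency: (N, N) nonnegative similarity matrix (a_kl)
    eta:       step size; lam: regularization strength
    Names and schedule are illustrative, not the paper's algorithm.
    """
    n = len(models)
    updated = models.copy()
    for k in range(n):
        # Gradient of (lam/2) * sum_l a_kl * ||w_k - w_l||^2 w.r.t. w_k
        reg_grad = sum(adjacency[k, l] * (models[k] - models[l])
                       for l in range(n))
        updated[k] = models[k] - eta * lam * reg_grad
    return updated

# Example: 3 clients, 2-dim models, fully connected similarity graph
models = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
adjacency = np.ones((3, 3)) - np.eye(3)
print(laplacian_step(models, adjacency, eta=0.1, lam=0.5))
```

In a decentralized variant, each client would apply the same pull using only the models received from its neighbors, removing the need for a central server.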