Federated learning has attracted increasing attention as a way to build models without accessing raw user data, especially in healthcare. In real applications, different federations can seldom work together, for reasons such as data heterogeneity and distrust of, or the absence of, a central server. In this paper, we propose a novel framework called MetaFed to facilitate trustworthy FL between different federations. MetaFed obtains a personalized model for each federation without a central server via the proposed Cyclic Knowledge Distillation. Specifically, MetaFed treats each federation as a meta distribution and aggregates knowledge from each federation in a cyclic manner. Training is split into two parts: common knowledge accumulation and personalization. Comprehensive experiments on three benchmarks demonstrate that MetaFed, without a server, achieves better accuracy than state-of-the-art methods (e.g., a 10%+ accuracy improvement over the baseline on PAMAP2) with lower communication costs.
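To make the cyclic, server-free procedure concrete, the sketch below shows one plausible reading of the two-phase scheme: federations arranged in a ring, each training on its own data while distilling from the previous federation's model, first with a strong distillation weight (common knowledge accumulation) and then with a lighter one (personalization). All names (`cyclic_kd`, `kd_loss`, `lam_acc`, `lam_pers`) and the specific loss weighting are illustrative assumptions, not the authors' reference implementation.

```python
# A minimal sketch of cyclic knowledge distillation, assuming each
# federation exposes a DataLoader of (x, y) batches. Hypothetical code,
# not MetaFed's official implementation.
import copy
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Soft-label distillation loss (Hinton-style temperature-scaled KL)."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

def local_update(model, teacher, loader, lam, epochs=1, lr=1e-3):
    """Train one federation: task loss plus distillation from its predecessor."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            logits = model(x)
            loss = F.cross_entropy(logits, y)
            if teacher is not None:
                with torch.no_grad():
                    t_logits = teacher(x)
                loss = loss + lam * kd_loss(logits, t_logits)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model

def cyclic_kd(init_model, loaders, rounds_acc=2, rounds_pers=1,
              lam_acc=1.0, lam_pers=0.3):
    """Pass knowledge around the ring of federations in two phases:
    common knowledge accumulation (strong distillation), then
    personalization (weaker distillation, yielding per-federation models).
    No central server: only the previous model travels along the ring."""
    models = [copy.deepcopy(init_model) for _ in loaders]
    teacher = None
    # Phase 1: accumulate common knowledge cyclically.
    for _ in range(rounds_acc):
        for i, loader in enumerate(loaders):
            models[i] = local_update(models[i], teacher, loader, lam_acc)
            teacher = copy.deepcopy(models[i])  # next federation's teacher
    # Phase 2: personalize each federation under a lighter KD constraint.
    for _ in range(rounds_pers):
        for i, loader in enumerate(loaders):
            models[i] = local_update(models[i], teacher, loader, lam_pers)
            teacher = copy.deepcopy(models[i])
    return models  # one personalized model per federation
```

Only model weights circulate between neighboring federations, which is consistent with the abstract's claim of lower communication costs than server-based aggregation.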