Federated learning enables multiple parties to collaboratively learn a model without exchanging their data. While most existing federated learning algorithms need many rounds to converge, one-shot federated learning (i.e., federated learning with a single communication round) is a promising approach to making federated learning practical in the cross-silo setting. However, existing one-shot algorithms only support specific models and do not provide any privacy guarantees, which significantly limits their applications in practice. In this paper, we propose a practical one-shot federated learning algorithm named FedKT. By utilizing the knowledge transfer technique, FedKT can be applied to any classification model and can flexibly achieve differential privacy guarantees. Our experiments on various tasks show that, with a single communication round, FedKT significantly outperforms other state-of-the-art federated learning algorithms.
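To make the one-shot knowledge-transfer idea concrete, the following is a minimal, hypothetical sketch (not the actual FedKT algorithm; all function names and the toy nearest-centroid "models" are illustrative assumptions): each party trains a local classifier on its private data, uploads only its predictions on a shared unlabeled transfer set in a single round, and the server trains a student model on the plurality-voted pseudo-labels.

```python
from collections import Counter

def train_centroid_model(data):
    # Toy local "model": per-class mean of 1-D features (illustrative only).
    sums, counts = {}, {}
    for x, y in data:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(model, x):
    # Assign the class whose centroid is nearest to x.
    return min(model, key=lambda y: abs(model[y] - x))

def one_shot_transfer(party_datasets, public_points):
    # Single communication round: each party sends only its predictions
    # on the shared public (unlabeled) transfer set, never its raw data.
    models = [train_centroid_model(d) for d in party_datasets]
    votes = [[predict(m, x) for m in models] for x in public_points]
    pseudo = [Counter(v).most_common(1)[0][0] for v in votes]
    # The server trains the student on the pseudo-labeled public set.
    return train_centroid_model(list(zip(public_points, pseudo)))

# Two parties with private (feature, label) pairs; a shared public set.
parties = [
    [(0.1, 0), (0.2, 0), (0.9, 1)],
    [(0.0, 0), (0.8, 1), (1.0, 1)],
]
student = one_shot_transfer(parties, [0.05, 0.15, 0.85, 0.95])
print(predict(student, 0.1), predict(student, 0.9))  # → 0 1
```

Because only votes on public points leave each party, this structure is also what makes it natural to add differential privacy (e.g., by noising the vote counts before the server aggregates them), though that step is omitted here for brevity.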