Today, data is often scattered across billions of resource-constrained edge devices subject to security and privacy constraints. Federated Learning (FL) has emerged as a viable solution for learning a global model while keeping data private, but the model complexity achievable in FL is constrained by the limited computation resources of edge nodes. In this work, we investigate a novel paradigm that leverages a powerful server model to break through the model-capacity barrier in FL. By selectively learning from multiple teacher clients and from itself, the server model develops in-depth knowledge and transfers that knowledge back to the clients to boost their respective performance. Our proposed framework achieves superior performance for both server and client models on various image classification tasks and provides several advantages in a unified framework, including flexibility for heterogeneous client architectures, robustness to poisoning attacks, and communication efficiency between clients and server.
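To make the "selectively learning from multiple teacher clients" idea concrete, below is a minimal PyTorch sketch of one server update step under illustrative assumptions: clients have already sent their logits on a shared public batch, a teacher is selected per sample only if its prediction matches the label, and the server combines a cross-entropy loss with a temperature-scaled distillation loss on the fused teacher logits. All function and parameter names here (e.g. `selective_distillation_step`, `tau`, `alpha`) are hypothetical, not the paper's actual API.

```python
import torch
import torch.nn.functional as F

def selective_distillation_step(server_model, client_logits, x, y,
                                optimizer, tau=3.0, alpha=0.5):
    """One server update on a public batch (illustrative sketch, not the
    paper's implementation).

    client_logits: tensor of shape (num_clients, batch, num_classes),
        the logits each client teacher produced on the public batch x.
    """
    server_model.train()
    logits = server_model(x)                      # (batch, num_classes)

    # Selection: keep a client's logits for a sample only if that
    # client's argmax prediction matches the ground-truth label.
    preds = client_logits.argmax(dim=-1)          # (num_clients, batch)
    correct = (preds == y.unsqueeze(0)).float()   # (num_clients, batch)

    # Fuse the selected teachers with a per-sample average.
    weights = correct / correct.sum(dim=0).clamp(min=1.0)
    fused = (weights.unsqueeze(-1) * client_logits).sum(dim=0)

    # Distill from the fused teacher only where at least one teacher
    # was correct; otherwise fall back to pure supervised loss.
    has_teacher = (correct.sum(dim=0) > 0).float()
    kd = F.kl_div(F.log_softmax(logits / tau, dim=-1),
                  F.softmax(fused / tau, dim=-1),
                  reduction="none").sum(dim=-1) * (tau ** 2)
    ce = F.cross_entropy(logits, y, reduction="none")
    loss = (alpha * has_teacher * kd + ce).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The same logits-over-a-public-batch channel could then carry the server's knowledge back to the clients, which is one way the framework could keep communication cheap and accommodate heterogeneous client architectures, since only predictions, not model weights, need to be exchanged.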