Federated learning (FL) enables multiple clients to train models collaboratively without sharing local data, and has achieved promising results in different areas, including the Internet of Things (IoT). However, end IoT devices lack the ability to automatically annotate their collected data, which leads to a label shortage at the client side. To collaboratively train an FL model, we can only use a small amount of labeled data stored on the server. This is a new yet practical scenario in federated learning, i.e., labels-at-server semi-supervised federated learning (SemiFL). Although several SemiFL approaches have been proposed recently, none of them addresses the personalization issue in its model design. IoT environments make SemiFL even more challenging, as we need to take device computational constraints and communication cost into consideration simultaneously. To tackle these new challenges together, we propose a novel SemiFL framework named pFedKnow. pFedKnow generates lightweight personalized client models via neural network pruning techniques to reduce communication cost. Moreover, it incorporates pretrained large models as prior knowledge to guide the aggregation of personalized client models and further enhance framework performance. Experimental results on both image and text datasets show that the proposed pFedKnow outperforms state-of-the-art baselines while substantially reducing communication cost. The source code of the proposed pFedKnow is available at https://github.com/JackqqWang/pfedknow/tree/master.
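To make the communication-cost argument concrete, the sketch below shows one common way pruning shrinks a client's upload: zero out the smallest-magnitude weights, then transmit only the surviving (index, value) pairs instead of the dense tensor. This is a minimal, hedged illustration of magnitude-based pruning in general, not pFedKnow's actual pruning or aggregation algorithm; the function names and the 75% sparsity level are illustrative assumptions.

```python
import numpy as np


def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the `sparsity` fraction of entries with the smallest magnitude.

    Illustrative only; pFedKnow's actual pruning scheme may differ.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask


def sparse_payload(pruned: np.ndarray):
    """Pack the nonzero entries as (indices, values) for transmission."""
    idx = np.flatnonzero(pruned)
    return idx.astype(np.int32), pruned.ravel()[idx]


rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))            # stand-in for one client weight matrix
pruned = magnitude_prune(w, sparsity=0.75)
idx, vals = sparse_payload(pruned)     # only 25% of entries are uploaded
```

At 75% sparsity, the client sends 4 index/value pairs instead of 16 dense floats; in a real deployment the savings compound across layers and communication rounds.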