In this work, we combine two paradigms, Federated Learning (FL) and Continual Learning (CL), for the task of text classification in the cloud-edge continuum. The objective of Federated Continual Learning (FCL) is to improve deep learning models over their lifetime at each client through (relevant and efficient) knowledge transfer without sharing data. Here, we address the challenge of minimizing inter-client interference during knowledge sharing, which arises from heterogeneous tasks across clients in the FCL setup. To this end, we propose a novel framework, Federated Selective Inter-client Transfer (FedSeIT), which selectively combines the model parameters of foreign clients. To further maximize knowledge transfer, we assess domain overlap and select informative tasks from the sequence of historical tasks at each foreign client while preserving privacy. Evaluating against baselines, we show improved performance: an average gain of 12.4\% in text classification over a sequence of tasks using five datasets from diverse domains. To the best of our knowledge, this is the first work to apply FCL to NLP.
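To make the idea of selective inter-client transfer concrete, the following is a minimal sketch, not the authors' implementation: a client scores the task parameters received from foreign clients by a domain-overlap proxy and aggregates only the most informative ones. All names and details here (domain_overlap, select_and_combine, cosine similarity over task representations, top-k selection with softmax weights, the toy dimensions) are illustrative assumptions rather than the FedSeIT method itself.

\begin{verbatim}
# Hypothetical sketch of selective inter-client transfer (not the paper's code).
import numpy as np

def domain_overlap(local_repr: np.ndarray, foreign_repr: np.ndarray) -> float:
    """Cosine similarity between privacy-preserving task representations,
    used here as a stand-in proxy for domain overlap."""
    num = float(local_repr @ foreign_repr)
    denom = np.linalg.norm(local_repr) * np.linalg.norm(foreign_repr) + 1e-12
    return num / denom

def select_and_combine(local_params: np.ndarray,
                       local_repr: np.ndarray,
                       foreign_tasks: list,
                       top_k: int = 2) -> np.ndarray:
    """Keep only the top-k foreign task parameters by domain overlap and
    mix them into the local parameters with overlap-weighted attention."""
    scores = np.array([domain_overlap(local_repr, r) for _, r in foreign_tasks])
    keep = np.argsort(scores)[::-1][:top_k]      # most relevant foreign tasks
    weights = np.exp(scores[keep]) / np.exp(scores[keep]).sum()
    transferred = sum(w * foreign_tasks[i][0] for w, i in zip(weights, keep))
    return local_params + transferred            # additive knowledge transfer

# Toy usage: 3 foreign tasks, 8-dim parameters, 16-dim task representations.
rng = np.random.default_rng(0)
local_params, local_repr = rng.normal(size=8), rng.normal(size=16)
foreign = [(rng.normal(size=8), rng.normal(size=16)) for _ in range(3)]
print(select_and_combine(local_params, local_repr, foreign).shape)  # (8,)
\end{verbatim}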