Federated learning (FL) is a popular collaborative training framework that aggregates model parameters from decentralized local clients. However, most existing methods unreasonably assume that the data categories of the FL framework are known and fixed in advance. As a result, the global model suffers significant degradation of recognition performance on old categories (i.e., catastrophic forgetting) when local clients receive new categories consecutively under limited memory for storing old categories. Moreover, new local clients that collect novel categories unseen by other clients may join the FL training irregularly, which further exacerbates catastrophic forgetting on old categories. To tackle these issues, we propose a novel Local-Global Anti-forgetting (LGA) model to address both local and global catastrophic forgetting on old categories; it is a pioneering work exploring a global class-incremental model in the FL field. Specifically, to tackle the class imbalance of each local client and thereby surmount local forgetting, we develop a category-balanced gradient-adaptive compensation loss and a category gradient-induced semantic distillation loss. They balance the heterogeneous forgetting speeds of hard-to-forget and easy-to-forget old categories, while ensuring consistent intrinsic class relations across different incremental tasks. Moreover, a proxy server is designed to tackle global forgetting caused by the Non-IID class imbalance between different clients. It collects perturbed prototype images of new categories from local clients via prototype gradient communication under privacy preservation, and augments them via self-supervised prototype augmentation to select the best old global model and improve local distillation gain. Experiments on representative datasets verify the superior performance of our model against comparison methods.
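As a rough illustration of the gradient-compensation idea described above, the following PyTorch sketch re-weights a cross-entropy loss by per-category gradient magnitudes so that every category receives a comparable effective update, which is one way to balance heterogeneous forgetting speeds. The function name and the exact weighting scheme are simplified assumptions for illustration, not the paper's formulation.

import torch
import torch.nn.functional as F

def gradient_compensated_ce(logits, targets, num_classes, eps=1e-8):
    """Cross-entropy re-weighted inversely to per-category gradient
    magnitude (hypothetical sketch, not the paper's exact loss)."""
    probs = F.softmax(logits, dim=1)
    # For softmax cross-entropy, the gradient w.r.t. the target logit is
    # p_y - 1, so |1 - p_y| measures how strongly each sample drives learning.
    p_target = probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    grad_mag = (1.0 - p_target).detach()
    # Average gradient magnitude per category within the batch.
    class_grad = torch.zeros(num_classes, device=logits.device)
    class_grad.scatter_add_(0, targets, grad_mag)
    counts = torch.bincount(targets, minlength=num_classes)
    present = counts > 0
    class_grad[present] = class_grad[present] / counts[present].float()
    # Weight each category inversely to its gradient magnitude, so the
    # effective per-category gradient is roughly the batch mean: categories
    # updated weakly are boosted and dominant ones are damped.
    mean_grad = class_grad[present].mean()
    weights = torch.ones(num_classes, device=logits.device)
    weights[present] = mean_grad / (class_grad[present] + eps)
    per_sample_loss = F.cross_entropy(logits, targets, reduction="none")
    return (weights[targets] * per_sample_loss).mean()

# Usage on random data:
logits = torch.randn(8, 10, requires_grad=True)
targets = torch.randint(0, 10, (8,))
loss = gradient_compensated_ce(logits, targets, num_classes=10)
loss.backward()

Equalizing effective gradient magnitudes in this way prevents categories with dominant gradients from monopolizing updates, which mirrors, in simplified form, the goal of balancing hard-to-forget and easy-to-forget old categories.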