Continual learning is an important problem for achieving human-level intelligence in real-world applications, as an agent must continuously accumulate knowledge in response to streaming data and tasks. In this work, we consider a general yet under-explored incremental learning problem in which both the class distribution and the class-specific domain distribution change over time. In addition to the typical challenges of class-incremental learning, this setting also faces an intra-class stability-plasticity dilemma and an intra-class domain imbalance problem. To address the above issues, we develop a novel domain-aware continual learning method based on the EM framework. Specifically, we introduce a flexible class representation based on the von Mises-Fisher mixture model to capture the intra-class structure, using an expansion-and-reduction strategy to dynamically increase the number of components according to the class complexity. Moreover, we design a bi-level balanced memory to cope with data imbalances within and across classes, and combine it with a distillation loss to achieve a better inter- and intra-class stability-plasticity trade-off. We conduct extensive experiments on three benchmarks: iDigits, iDomainNet, and iCIFAR-20. The results show that our approach consistently outperforms previous methods by a significant margin, demonstrating its superiority.
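For concreteness, a minimal sketch of the class-conditional density that a von Mises-Fisher (vMF) mixture class representation takes, assuming unit-normalized features; the symbols $K_c$ (number of components for class $c$), $\pi_{c,k}$ (mixture weights), $\mu_{c,k}$ (mean directions), and $\kappa_{c,k}$ (concentrations) are standard vMF-mixture notation assumed here, not taken from the paper itself:

$$
p(z \mid c) = \sum_{k=1}^{K_c} \pi_{c,k}\, C_d(\kappa_{c,k}) \exp\!\left(\kappa_{c,k}\, \mu_{c,k}^{\top} z\right), \qquad \|z\| = \|\mu_{c,k}\| = 1, \quad \sum_{k=1}^{K_c} \pi_{c,k} = 1,
$$

where $C_d(\kappa) = \kappa^{d/2-1} / \big((2\pi)^{d/2} I_{d/2-1}(\kappa)\big)$ is the vMF normalizing constant and $I_\nu$ is the modified Bessel function of the first kind. Under an EM treatment, the E-step computes component responsibilities from this density and the M-step re-estimates $(\pi, \mu, \kappa)$ per class; an expansion-and-reduction strategy would then grow or prune $K_c$ as the observed complexity of each class changes.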