Online Class-Incremental Learning (CIL) is a challenging setting in Continual Learning (CL), in which data from new tasks arrive as incoming streams and online learning models must handle these streams without revisiting previous ones. Existing works use a single centroid, adapted to the incoming data stream, to characterize each class. This approach exposes limitations when the incoming data stream of a class is naturally multimodal. To address this issue, in this work, we first propose an online mixture model learning approach based on the well-established theory of optimal transport (OT-MM). Specifically, the centroids and covariance matrices of the mixture model are adapted incrementally as the data stream arrives. The advantages are two-fold: (i) we can characterize complex data streams more accurately, and (ii) by using the per-class centroids produced by OT-MM, we can more reliably estimate the similarity of an unseen example to each class at inference time. Moreover, to combat catastrophic forgetting in the CIL scenario, we further propose Dynamic Preservation. In particular, after applying this technique across data streams, the latent representations of classes in the old and new tasks become more compact within each class and more separated from one another. Together with a contraction feature extractor, this technique helps the model mitigate catastrophic forgetting. Experimental results on real-world datasets show that our proposed method significantly outperforms current state-of-the-art baselines.
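To make the incremental-update idea concrete, the following is a minimal sketch of an online per-class mixture model, not the paper's actual OT-MM: the class name `OnlineMixture`, the adaptation rate `lr`, and the use of Gaussian responsibilities in place of an optimal-transport assignment plan are all illustrative assumptions. Each mini-batch from the stream nudges the centroids and covariances toward the batch statistics, and inference scores an unseen example by its distance to the nearest centroid.

```python
import numpy as np

class OnlineMixture:
    """Illustrative online mixture model for one class.

    Hypothetical sketch: the paper's OT-MM derives assignment weights
    from an optimal-transport plan; plain Gaussian responsibilities
    are used here as a stand-in.
    """

    def __init__(self, n_components, dim, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.mu = rng.normal(size=(n_components, dim))     # centroids
        self.cov = np.stack([np.eye(dim)] * n_components)  # covariance matrices
        self.lr = lr                                       # adaptation rate (assumed)

    def _responsibilities(self, X):
        # Soft assignment of each point to each component; OT-MM would
        # replace this with the optimal-transport coupling.
        logp = np.stack(
            [-0.5 * np.sum((X - m) @ np.linalg.inv(c) * (X - m), axis=1)
             for m, c in zip(self.mu, self.cov)],
            axis=1,
        )
        logp -= logp.max(axis=1, keepdims=True)
        p = np.exp(logp)
        return p / p.sum(axis=1, keepdims=True)

    def partial_fit(self, X):
        # Adapt centroids and covariances incrementally to one mini-batch
        # of the stream; earlier batches are never revisited.
        r = self._responsibilities(X)
        for k in range(self.mu.shape[0]):
            w = r[:, k:k + 1]
            if w.sum() < 1e-8:
                continue
            mean_k = (w * X).sum(axis=0) / w.sum()
            d = X - mean_k
            # Small ridge keeps the covariance invertible.
            cov_k = (w * d).T @ d / w.sum() + 1e-6 * np.eye(X.shape[1])
            self.mu[k] += self.lr * (mean_k - self.mu[k])
            self.cov[k] += self.lr * (cov_k - self.cov[k])

    def score(self, x):
        # Similarity of an unseen example to this class: distance to the
        # nearest centroid (smaller means more similar).
        return np.min(np.linalg.norm(self.mu - x, axis=1))
```

In a full CIL pipeline, one such model would be maintained per class on top of the feature extractor, and a test example would be assigned to the class whose mixture yields the smallest centroid distance; this mirrors the inference rule described above, though the exact OT-based update and scoring rules in the paper differ.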