Class-Incremental Learning is a challenging machine-learning problem that aims to extend previously trained neural networks with new classes. This is especially useful when the system must classify new objects even though the original training data is no longer available. While semantic segmentation has received less attention than classification in this setting, it poses distinct challenges, since previous and future target classes may be unlabeled in the images of a single increment. In this case, the background, past, and future classes are correlated and a background shift arises. In this paper, we address the problem of how to model unlabeled classes while avoiding spurious feature clustering of future, uncorrelated classes. We propose to use Evidential Deep Learning to model the evidence for the classes as a Dirichlet distribution. Our method factorizes the problem into a separate foreground class probability, computed as the expected value of the Dirichlet distribution, and an unknown-class (background) probability corresponding to the uncertainty of the estimate. In our novel formulation, the background probability is modeled implicitly, avoiding the feature-space clustering that results from forcing the model to output a high background score for pixels that are not labeled as objects. Experiments on the incremental Pascal VOC and ADE20k benchmarks show that our method outperforms the state of the art, especially when repeatedly learning new classes over an increasing number of increments.
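For context, the factorization described above can be expressed with the standard Evidential Deep Learning (subjective-logic) quantities. The following is a sketch consistent with the abstract, not necessarily the paper's exact formulation: for a pixel with predicted non-negative evidence \(e_k\) over \(K\) foreground classes,
\[
\alpha_k = e_k + 1, \qquad S = \sum_{k=1}^{K} \alpha_k, \qquad
\hat{p}_k = \mathbb{E}_{\mathrm{Dir}(\boldsymbol{\alpha})}[p_k] = \frac{\alpha_k}{S}, \qquad
u = \frac{K}{S},
\]
where \(\hat{p}_k\) serves as the foreground class probability and the uncertainty mass \(u\) plays the role of the implicitly modeled background probability; how the two are normalized against each other is left to the paper's method section.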