Incremental learning is a crucial task in aerial image processing, especially given the limited availability of large-scale annotated datasets. A major issue affecting current deep neural architectures is catastrophic forgetting, namely the inability to retain past knowledge once the model is retrained on a new set of data. Over the years, several techniques have been proposed to mitigate this problem for image classification and object detection. However, only recently has the focus shifted towards more complex downstream tasks such as instance or semantic segmentation. Starting from incremental-class learning for semantic segmentation, our goal is to adapt this strategy to the aerial domain, exploiting a peculiar feature that differentiates it from natural images, namely orientation. In addition to the standard knowledge distillation approach, we propose a contrastive regularization in which any given input is compared with its augmented version (i.e., flips and rotations) to minimize the difference between the segmentation features produced by the two inputs. We show the effectiveness of our solution on the Potsdam dataset, outperforming the incremental baseline in every test. Code available at: https://github.com/edornd/contrastive-distillation.
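To make the contrastive regularization concrete, below is a minimal PyTorch sketch, not the paper's actual implementation. It assumes the segmentation network returns dense feature maps of shape (B, C, H, W); the function name `contrastive_regularization` and the use of a mean-squared-error penalty are illustrative assumptions, since the abstract only states that the difference between the two feature maps is minimized.

```python
import torch
import torch.nn.functional as F


def contrastive_regularization(model, images):
    """Hedged sketch: penalize the discrepancy between segmentation
    features of an image and of its rotated version.

    Assumes `model(images)` returns dense features (B, C, H, W);
    this is an illustrative interface, not the paper's API.
    """
    # Features of the original batch.
    feats = model(images)

    # Random orientation augmentation: a 90-degree rotation, k in {1, 2, 3}.
    # Flips could be handled analogously with torch.flip.
    k = int(torch.randint(1, 4, (1,)))
    images_aug = torch.rot90(images, k, dims=(2, 3))
    feats_aug = model(images_aug)

    # Undo the rotation on the features so both maps are spatially aligned.
    feats_aug = torch.rot90(feats_aug, -k, dims=(2, 3))

    # Minimize the difference between the two feature maps.
    return F.mse_loss(feats_aug, feats)
```

Under this reading, the term would be added to the usual incremental objective, e.g. `loss = seg_loss + distill_loss + lam * contrastive_regularization(model, images)`, where the weight `lam` is a hypothetical hyperparameter not specified in the abstract.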