We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems. Instead of tackling continual learning via external memory, growing models, or regularization, EBMs support a dynamically growing number of tasks or classes in a way that causes less interference with previously learned information. We find that EBMs outperform the baseline methods by a large margin on several continual learning benchmarks. We also show that EBMs adapt to a more general continual learning setting in which the data distribution changes without explicitly delineated tasks. These observations point towards EBMs as a class of models naturally suited to the continual learning regime.
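To make the claim about supporting a growing number of classes concrete, the following is a minimal sketch (not the paper's actual architecture) of an energy-based classifier E(x, y): each candidate class contributes a learned label embedding, a small head scores the (input, label) pair with a scalar energy, and prediction takes the class with the lowest energy among the classes seen so far. The class EBMClassifier, its layer sizes, and the multiplicative feature-label combination are illustrative assumptions; the point is only that new classes can be accommodated by adding label embeddings rather than resizing and renormalizing a softmax output layer.

```python
import torch
import torch.nn as nn


class EBMClassifier(nn.Module):
    """Hypothetical minimal energy-based classifier E(x, y)."""

    def __init__(self, in_dim: int, num_classes: int, hidden: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.label_emb = nn.Embedding(num_classes, hidden)   # one row per class; more can be appended later
        self.energy_head = nn.Linear(hidden, 1)

    def energy(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # Scalar energy for each (x, y) pair; lower energy means a better match.
        h = self.encoder(x) * self.label_emb(y)
        return self.energy_head(h).squeeze(-1)

    def predict(self, x: torch.Tensor, candidate_classes: torch.Tensor) -> torch.Tensor:
        # Score x against every candidate class and return the lowest-energy class.
        energies = torch.stack(
            [self.energy(x, c.expand(x.size(0))) for c in candidate_classes], dim=1
        )
        return candidate_classes[energies.argmin(dim=1)]


# Usage: the set of classes considered at prediction time can grow across tasks.
model = EBMClassifier(in_dim=784, num_classes=100)
x = torch.randn(4, 784)
seen = torch.tensor([0, 1, 2])  # classes observed in tasks so far
print(model.predict(x, seen))
```

Under this sketch, introducing a new class only requires a new label embedding and evaluating its energy alongside the old ones, which is one way to read the abstract's claim of reduced interference with previously learned information.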