We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems. Instead of tackling continual learning via external memory, growing models, or regularization, EBMs naturally support a dynamically growing number of tasks or classes while causing less interference with previously learned information. Our proposed version of EBMs for continual learning is simple and efficient, and it outperforms baseline methods by a large margin on several benchmarks. Moreover, our proposed contrastive-divergence-based training objective can be applied to other continual learning methods, resulting in substantial boosts in their performance. We also show that EBMs adapt to a more general continual learning setting where the data distribution changes without explicitly delineated task boundaries. These observations point toward EBMs as a class of models naturally suited to the continual learning regime.
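To make the contrastive-divergence-style objective concrete, the following is a minimal sketch, not the authors' released implementation: a class-conditional EBM assigns a scalar energy E(x, y) to each input-class pair and is trained by contrasting the true class only against classes present in the current batch, which is one way the architecture can grow with new classes without touching old ones. The network `ConditionalEBM`, its dimensions, and the stand-in data are hypothetical placeholders introduced for illustration.

```python
# Minimal sketch of a class-conditional EBM with a batch-local contrastive
# objective, assuming PyTorch. All names and sizes are illustrative and do
# not reproduce the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConditionalEBM(nn.Module):
    """Scalar energy E(x, y); low energy marks a compatible (x, y) pair."""
    def __init__(self, in_dim, num_classes, hidden=128, emb_dim=32):
        super().__init__()
        # New classes only require new embedding rows, not a new output head.
        self.class_emb = nn.Embedding(num_classes, emb_dim)
        self.net = nn.Sequential(
            nn.Linear(in_dim + emb_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, self.class_emb(y)], dim=-1)).squeeze(-1)

def contrastive_loss(model, x, y, batch_classes):
    """Lower E(x, y+) relative to E(x, y') for y' drawn only from the
    current batch, leaving classes from earlier tasks untouched."""
    bsz, n_cls = x.size(0), batch_classes.size(0)
    # Score every example against every class present in this batch.
    x_rep = x.unsqueeze(1).expand(-1, n_cls, -1).reshape(bsz * n_cls, -1)
    y_rep = batch_classes.unsqueeze(0).expand(bsz, -1).reshape(-1)
    energies = model(x_rep, y_rep).view(bsz, n_cls)
    # Index of each example's true class within batch_classes.
    targets = (y.unsqueeze(1) == batch_classes.unsqueeze(0)).float().argmax(dim=1)
    # Softmax over negative energies: a contrastive-divergence-style objective.
    return F.cross_entropy(-energies, targets)

# Usage sketch with random stand-in data.
model = ConditionalEBM(in_dim=784, num_classes=100)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(16, 784), torch.randint(0, 10, (16,))
loss = contrastive_loss(model, x, y, batch_classes=y.unique())
opt.zero_grad(); loss.backward(); opt.step()
```

Because the negative set is restricted to classes seen in the current batch, the loss never pushes up the energy of absent (previously learned) classes, which is the intuition behind the reduced interference claimed above.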