We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems. Instead of tackling continual learning via the use of external memory, growing models, or regularization, EBMs change the underlying training objective to cause less interference with previously learned information. Our proposed version of EBMs for continual learning is simple, efficient, and outperforms baseline methods by a large margin on several benchmarks. Moreover, our proposed contrastive divergence-based training objective can be combined with other continual learning methods, resulting in substantial boosts in their performance. We further show that EBMs are adaptable to a more general continual learning setting where the data distribution changes without the notion of explicitly delineated tasks. These observations point towards EBMs as a useful building block for future continual learning methods.