Continually learning new skills is important for intelligent systems, yet standard deep-learning methods suffer from catastrophic forgetting of the past. Recent works address this with weight regularisation. Functional regularisation, although computationally expensive, is expected to perform better, but rarely does so in practice. In this paper, we fix this issue with a new functional-regularisation approach that utilises a few memorable past examples that are crucial to avoiding forgetting. By using a Gaussian-Process formulation of deep networks, our approach enables training in weight-space while identifying both the memorable past and a functional prior. Our method achieves state-of-the-art performance on standard benchmarks and opens a new direction for life-long learning, where regularisation-based and memory-based methods are naturally combined.
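The core idea of functional regularisation over a memorable past can be sketched in a toy form: store the model's outputs on a few past examples, then penalise drift away from those outputs while training on a new task. The sketch below is a minimal illustration only, using a linear model and a plain squared-output penalty; it is not the paper's GP-based objective, and all names (`train`, `predict`, the data) are hypothetical.

```python
import numpy as np

def predict(w, X):
    # A linear model stands in for the deep network in this toy sketch.
    return X @ w

def train(w0, X_task, y_task, X_mem, f_mem, lam, lr=0.05, steps=500):
    """Gradient descent on: task MSE + lam * functional penalty.

    The penalty keeps the model's outputs on a few memorable past
    examples (X_mem) close to their stored function values (f_mem),
    which is the generic functional-regularisation idea.
    """
    w = w0.copy()
    for _ in range(steps):
        grad_task = 2 * X_task.T @ (predict(w, X_task) - y_task) / len(X_task)
        grad_mem = 2 * X_mem.T @ (predict(w, X_mem) - f_mem) / len(X_mem)
        w -= lr * (grad_task + lam * grad_mem)
    return w

rng = np.random.default_rng(0)
w_old = np.array([1.0, -2.0])     # parameters after the "past" task
X_mem = rng.normal(size=(3, 2))   # a few memorable past examples
f_mem = predict(w_old, X_mem)     # their stored function values

X_new = rng.normal(size=(20, 2))  # new-task data
y_new = X_new @ np.array([0.5, 0.5])

w_reg = train(w_old, X_new, y_new, X_mem, f_mem, lam=1.0)
w_plain = train(w_old, X_new, y_new, X_mem, f_mem, lam=0.0)

# How much the function values on the memorable points have drifted.
drift_reg = float(np.max(np.abs(predict(w_reg, X_mem) - f_mem)))
drift_plain = float(np.max(np.abs(predict(w_plain, X_mem) - f_mem)))
```

With the penalty switched on (`lam=1.0`), the drift on the memorable points is smaller than without it, which is exactly the "forgetting" the regulariser suppresses; the paper's contribution is identifying which past examples to store and how to weight the penalty via the GP view, neither of which this sketch models.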