Neural networks are susceptible to catastrophic forgetting: they fail to preserve previously acquired knowledge when adapting to new tasks. Inspired by the human associative memory system, we propose a brain-like approach that imitates the associative learning process to achieve continual learning. We design a heuristic mechanism that potentiatively stimulates the model, guiding it to recall historical episodes based on the current circumstance and the accumulated associative experience. In addition, a distillation measure depressively alters the efficacy of synaptic transmission, damping feature-reconstruction learning on the new task. The framework is thus mediated by potentiation and depression stimuli, which play opposing roles in directing synaptic and behavioral plasticity. It requires no access to the original data and more closely resembles the human cognitive process. Experiments demonstrate the effectiveness of our method in alleviating catastrophic forgetting on continual image reconstruction problems.
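To make the opposing roles of the two stimuli concrete, the combined objective can be sketched as a new-task reconstruction loss plus a distillation penalty that restrains drift from the frozen old model. This is a minimal illustration under assumed mean-squared losses and a fixed weighting, not the authors' actual implementation; the function and parameter names (`continual_loss`, `distill_weight`) are hypothetical.

```python
import numpy as np

def continual_loss(new_feats, old_feats, recon, target, distill_weight=0.5):
    """Illustrative continual-learning objective (assumed form).

    recon_loss  -- drives feature-reconstruction learning on the new task
                   (the "potentiation" direction of plasticity).
    distill_loss -- penalizes deviation of the new model's features from
                    the frozen old model's features, damping new-task
                    learning (the "depression" direction of plasticity).
    """
    recon_loss = np.mean((recon - target) ** 2)
    distill_loss = np.mean((new_feats - old_feats) ** 2)
    return recon_loss + distill_weight * distill_loss
```

Because the distillation term only needs the old model's outputs on current inputs, no original training data has to be stored, matching the data-free setting described above.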