Continual learning (CL) in the brain is facilitated by a complex set of mechanisms. These include the interplay of multiple memory systems for consolidating information, as posited by the complementary learning systems (CLS) theory, and synaptic consolidation for protecting acquired knowledge from erasure. Thus, we propose a general CL method that creates a synergy between SYNaptic consolidation and dual memory Experience Replay (SYNERgy). Our method maintains a semantic memory that accumulates and consolidates information across tasks and interacts with the episodic memory for effective replay. It further employs synaptic consolidation by tracking the importance of parameters during the training trajectory and anchoring them to the consolidated parameters in the semantic memory. To the best of our knowledge, our study is the first to employ dual memory experience replay in conjunction with synaptic consolidation in a manner suitable for general CL, whereby the network does not utilize task boundaries or task labels during training or inference. Our evaluation on various challenging CL scenarios and characteristic analyses demonstrates the efficacy of incorporating both synaptic consolidation and CLS theory in enabling effective CL in DNNs.
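The core components described above can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a semantic memory maintained as an exponential moving average (EMA) of the working parameters, a reservoir-sampled episodic buffer (which needs no task boundaries), and a squared-gradient importance estimate that anchors parameters to their consolidated values; all hyperparameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Working model parameters (a single flattened vector, for illustration)
theta = rng.normal(size=8)

# Semantic memory: slow EMA copy of the working parameters (assumed decay)
theta_sem = theta.copy()
ema_decay = 0.99

# Episodic memory: reservoir-sampled buffer of past samples (assumed capacity)
episodic = []
capacity = 5

# Per-parameter importance estimate (running average of squared gradients)
importance = np.zeros_like(theta)

def consolidation_penalty(theta, theta_sem, importance, lam=0.1):
    """Penalty anchoring important parameters to their consolidated values."""
    return lam * np.sum(importance * (theta - theta_sem) ** 2)

for step in range(100):
    x = rng.normal(size=8)  # incoming sample from the data stream
    # Reservoir sampling keeps a uniform subset without task boundaries
    if len(episodic) < capacity:
        episodic.append(x)
    else:
        j = rng.integers(0, step + 1)
        if j < capacity:
            episodic[j] = x
    grad = theta - x  # toy gradient of a quadratic loss on the sample
    importance = 0.9 * importance + 0.1 * grad ** 2
    # Synaptic consolidation: gradient of the anchoring penalty
    grad += 0.1 * 2 * importance * (theta - theta_sem)
    theta -= 0.05 * grad
    # Consolidate the working parameters into semantic memory
    theta_sem = ema_decay * theta_sem + (1 - ema_decay) * theta

print(len(episodic))
print(consolidation_penalty(theta, theta_sem, importance) >= 0.0)
```

In a full method, replayed samples drawn from the episodic buffer would also contribute to the gradient, with the semantic memory providing consolidated targets; the sketch keeps only the parameter-update skeleton.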