According to the Complementary Learning Systems (CLS) theory~\cite{mcclelland1995there} in neuroscience, humans achieve effective \emph{continual learning} through two complementary systems: a fast learning system, centered on the hippocampus, for rapid learning of the specifics of individual experiences; and a slow learning system, located in the neocortex, for the gradual acquisition of structured knowledge about the environment. Motivated by this theory, we propose \emph{DualNets} (for Dual Networks), a general continual learning framework comprising a fast learning system for supervised learning of pattern-separated representations from specific tasks and a slow learning system for learning task-agnostic, general representations via Self-Supervised Learning (SSL). DualNets can seamlessly incorporate both representation types into a holistic framework to facilitate better continual learning in deep neural networks. Via extensive experiments, we demonstrate the promising results of DualNets on a wide range of continual learning protocols, ranging from the standard offline, task-aware setting to the challenging online, task-free scenario. Notably, on the CTrL~\cite{veniat2020efficient} benchmark, which has unrelated tasks with vastly different visual images, DualNets can achieve competitive performance with existing state-of-the-art dynamic architecture strategies~\cite{ostapenko2021continual}. Furthermore, we conduct comprehensive ablation studies to validate DualNets' efficacy, robustness, and scalability. Code is publicly available at \url{https://github.com/phquang/DualNet}.