The ability to learn continuously from an incoming data stream without catastrophic forgetting is critical for designing intelligent systems. Many existing approaches to continual learning rely on stochastic gradient descent and its variants. However, these algorithms must implement various strategies, such as memory buffers or replay, to overcome well-known shortcomings of stochastic gradient descent in terms of stability, greediness, and short-term memory. To address these limitations, we develop a biologically-inspired lightweight neural network architecture that incorporates local learning and neuromodulation to enable online learning over streaming inputs. Next, we address the challenge of hyperparameter selection for tasks that are not known in advance by implementing transfer metalearning: using Bayesian optimization to explore a design space spanning multiple local learning rules and their hyperparameters, we identify high-performing configurations in classical single-task online learning and transfer them to continual learning tasks with task-similarity considerations. We demonstrate the efficacy of our approach in both single-task and continual learning settings. In the single-task setting, our architecture outperforms other local learning approaches on the MNIST, Fashion-MNIST, and CIFAR-10 datasets. Using high-performing configurations metalearned in the single-task setting, we achieve superior continual learning performance on Split-MNIST and Split-CIFAR-10 compared with other memory-constrained learning approaches, and match that of state-of-the-art memory-intensive replay-based approaches.
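To make the local learning component concrete, the sketch below implements a simple neuromodulated three-factor update in NumPy: each weight change depends only on pre-synaptic activity, post-synaptic error, and a scalar modulatory gate, with no backpropagated gradients. This is a minimal sketch under illustrative assumptions; the layer sizes, softmax readout, and the particular modulation signal are stand-ins, not the exact rules in the paper's design space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration (e.g. flattened MNIST digits).
n_in, n_out = 784, 10
W = 0.01 * rng.standard_normal((n_out, n_in))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def three_factor_step(W, x, target, lr=0.05):
    """One online, purely local update on a single stream sample.

    The weight change is Hebbian-like (post-synaptic error times
    pre-synaptic activity), gated by a scalar neuromodulatory signal,
    so no gradients are backpropagated through the network.
    """
    post = softmax(W @ x)                 # post-synaptic activity
    error = target - post                 # per-unit mismatch
    modulation = np.abs(error).sum()      # scalar neuromodulatory gate
    W = W + lr * modulation * np.outer(error, x)
    return W, post

# Streaming usage: one update per incoming sample, no memory buffer.
for _ in range(100):
    x = rng.random(n_in)
    target = np.eye(n_out)[rng.integers(n_out)]
    W, post = three_factor_step(W, x, target)
```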
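The transfer-metalearning step can likewise be sketched with an off-the-shelf Bayesian optimizer. Below, scikit-optimize's gp_minimize searches a toy design space over rule families and their hyperparameters; the dimension names and the evaluate_online_accuracy surrogate are hypothetical stand-ins for the paper's actual single-task online learning evaluation, shown only to illustrate the search loop.

```python
from skopt import gp_minimize
from skopt.space import Categorical, Real

# Hypothetical design space: the choice of local rule plus its
# hyperparameters (names are illustrative, not the paper's exact space).
space = [
    Categorical(["hebbian", "oja", "three_factor"], name="rule"),
    Real(1e-4, 1e-1, prior="log-uniform", name="lr"),
    Real(0.0, 1.0, name="modulation_gain"),
]

def evaluate_online_accuracy(rule, lr, modulation_gain):
    """Stand-in for a single-task online learning run (e.g. one pass
    over MNIST with the chosen rule); here a synthetic surrogate."""
    base = {"hebbian": 0.80, "oja": 0.82, "three_factor": 0.85}[rule]
    return base - abs(lr - 0.01) - 0.05 * (1.0 - modulation_gain)

def objective(params):
    rule, lr, gain = params
    return -evaluate_online_accuracy(rule, lr, gain)  # gp_minimize minimizes

# Metalearn a high-performing configuration on the single task; the
# resulting settings are then transferred to the continual setting
# (e.g. Split-MNIST), with task similarity guiding the transfer.
result = gp_minimize(objective, space, n_calls=30, random_state=0)
print("best configuration:", result.x, "accuracy:", -result.fun)
```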