The fast adaptation capability of deep neural networks in non-stationary environments is critical for online time series forecasting. Successful solutions require handling shifts to both new and recurring patterns. However, training deep neural forecasters on the fly is notoriously challenging because of their limited ability to adapt to non-stationary environments and their catastrophic forgetting of old knowledge. In this work, inspired by the Complementary Learning Systems (CLS) theory, we propose Fast and Slow learning Networks (FSNet), a holistic framework for online time-series forecasting that simultaneously deals with abruptly changing and repeating patterns. In particular, FSNet improves a slowly-learned backbone by dynamically balancing fast adaptation to recent changes with retrieval of similar old knowledge. FSNet achieves this via the interaction of two complementary components: an adapter that monitors each layer's contribution to the loss, and an associative memory that supports remembering, updating, and recalling repeating events. Extensive experiments on real and synthetic datasets validate FSNet's efficacy and robustness to both new and recurring patterns. Our code will be made publicly available.
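The mechanism summarized above (a per-layer adapter driven by gradient statistics, plus an associative memory for recurring events) can be illustrated with a minimal conceptual sketch. This is not the authors' implementation: the backbone choice (a single 1-D convolution), the names `FastSlowConvBlock`, `gamma_fast`, `gamma_slow`, and `mem_size`, the change-detection threshold, and the memory read/write rule are all illustrative assumptions.

```python
# Minimal sketch of FSNet's two complementary components, under the assumptions above:
#   (1) an adapter that monitors an EMA of each layer's gradient (its contribution
#       to the loss) and produces per-channel calibration coefficients, and
#   (2) an associative memory of past calibrations, consulted when the fast and
#       slow gradient EMAs diverge, so adaptations to recurring patterns can be recalled.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FastSlowConvBlock(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, mem_size=32,
                 gamma_fast=0.9, gamma_slow=0.99):
        super().__init__()
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, padding=kernel_size // 2)
        n_params = sum(p.numel() for p in self.conv.parameters())
        # Fast/slow EMAs of the flattened layer gradient: the fast one drives
        # adaptation, the slow one serves as a reference to detect shifts.
        self.register_buffer("grad_ema_fast", torch.zeros(n_params))
        self.register_buffer("grad_ema_slow", torch.zeros(n_params))
        self.gamma_fast, self.gamma_slow = gamma_fast, gamma_slow
        # Adapter: maps the gradient EMA to per-channel scale/shift calibration.
        self.adapter = nn.Linear(n_params, 2 * out_ch)
        # Associative memory of past calibration vectors (recall of old events).
        self.register_buffer("memory", torch.zeros(mem_size, 2 * out_ch))

    @torch.no_grad()
    def update_grad_ema(self):
        # Call after loss.backward(), before the optimizer step.
        g = torch.cat([p.grad.flatten() for p in self.conv.parameters()])
        self.grad_ema_fast.mul_(self.gamma_fast).add_((1 - self.gamma_fast) * g)
        self.grad_ema_slow.mul_(self.gamma_slow).add_((1 - self.gamma_slow) * g)

    def forward(self, x):
        # The adapter turns the fast gradient EMA into calibration coefficients.
        coeff = self.adapter(self.grad_ema_fast)
        # If the fast and slow EMAs disagree strongly, treat it as a pattern shift:
        # read the most similar memory entry (a rough stand-in for the paper's
        # memory interaction) and write the result back for future recall.
        if self.grad_ema_slow.abs().sum() > 0:
            sim = F.cosine_similarity(self.grad_ema_fast, self.grad_ema_slow, dim=0)
            if sim < 0.5:
                if self.memory.abs().sum() > 0:
                    attn = F.softmax(self.memory @ coeff, dim=0)
                    coeff = 0.5 * coeff + 0.5 * (attn @ self.memory)
                with torch.no_grad():
                    slot = int(torch.randint(self.memory.size(0), (1,)))
                    self.memory[slot] = coeff.detach()
        scale, shift = coeff.chunk(2)
        h = self.conv(x)  # x: (batch, in_ch, length)
        return h * torch.sigmoid(scale).view(1, -1, 1) + shift.view(1, -1, 1)
```

In an online training loop, one would call `loss.backward()`, then `update_grad_ema()` on each block, then the optimizer step, so that the adapter and the memory always condition on the most recent gradient statistics rather than on stale ones.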