We study the problem of efficiently scaling ensemble-based deep neural networks for multi-step time series (TS) forecasting over large collections of time series. Current state-of-the-art deep ensemble models have high memory and computational requirements, hampering their use for forecasting millions of TS in practical scenarios. We propose N-BEATS(P), a global parallel variant of the N-BEATS model designed to allow the simultaneous training of multiple univariate TS forecasting models. Our model addresses the practical limitations of related models, halving training time and reducing memory requirements by a factor of 5, while maintaining the same level of accuracy in all TS forecasting settings. We performed multiple experiments detailing the various ways to train our model, and the results demonstrate its capacity to generalize under various forecasting conditions and setups.