We study the problem of efficiently scaling ensemble-based deep neural networks for time series (TS) forecasting on a large set of time series. Current state-of-the-art deep ensemble models have high memory and computational requirements, which hampers their use for forecasting millions of TS in practical scenarios. We propose N-BEATS(P), a global multivariate variant of the N-BEATS model designed to allow simultaneous training of multiple univariate TS forecasting models. Our model addresses the practical limitations of related models, reducing training time by half and memory requirements by a factor of 5, while maintaining the same level of accuracy. We have performed multiple experiments detailing the various ways to train our model and have obtained results demonstrating its capacity to support zero-shot TS forecasting, i.e., to train a neural network on a source TS dataset and deploy it on a different target TS dataset without retraining, which provides an efficient and reliable solution for forecasting at scale even in difficult forecasting conditions.