Hybrid methods have been shown to outperform pure statistical and pure deep learning methods at forecasting tasks and at quantifying the uncertainty associated with those forecasts (prediction intervals). One example is the Exponential Smoothing Recurrent Neural Network (ES-RNN), a hybrid between a statistical forecasting model and a recurrent neural network variant. ES-RNN achieves a 9.4\% improvement in absolute error in the Makridakis-4 Forecasting Competition. This improvement, and similar outperformance from other hybrid models, has primarily been demonstrated only on univariate datasets. Difficulties with applying hybrid forecast methods to multivariate data include ($i$) the high computational cost of hyperparameter tuning for models that are not parsimonious, ($ii$) challenges associated with the auto-correlation inherent in the data, and ($iii$) complex dependency (cross-correlation) between the covariates that may be hard to capture. This paper presents Multivariate Exponential Smoothing Long Short-Term Memory (MES-LSTM), a generalized multivariate extension of ES-RNN that overcomes these challenges. MES-LSTM utilizes a vectorized implementation. We test MES-LSTM on several aggregated coronavirus disease 2019 (COVID-19) morbidity datasets and find that our hybrid approach shows consistent, significant improvement over pure statistical and pure deep learning methods in both forecast accuracy and prediction interval construction.
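To make the hybrid idea concrete, the sketch below illustrates one common way such a model can be assembled: an exponential smoothing layer adjusts each covariate for level and seasonality, and an LSTM then forecasts the adjusted, more stationary multivariate signal. This is a minimal illustration under assumed design choices (Holt-Winters preprocessing via statsmodels, a PyTorch LSTM, and hypothetical function and class names); it is not the paper's MES-LSTM implementation.

```python
# Minimal sketch of a hybrid exponential-smoothing + LSTM forecaster.
# All names here (smooth_and_normalize, ResidualLSTM) are illustrative
# assumptions, not the authors' implementation.
import numpy as np
import torch
import torch.nn as nn
from statsmodels.tsa.holtwinters import ExponentialSmoothing


def smooth_and_normalize(series, seasonal_periods=7):
    """Fit Holt-Winters to one covariate; return the series divided by its
    fitted level-plus-seasonality so the network sees an adjusted signal."""
    fit = ExponentialSmoothing(
        series, trend="add", seasonal="add", seasonal_periods=seasonal_periods
    ).fit()
    adjusted = series / np.clip(fit.fittedvalues, 1e-8, None)
    return adjusted, fit  # keep the fit to re-inject level/seasonality later


class ResidualLSTM(nn.Module):
    """LSTM that forecasts the smoothing-adjusted multivariate signal."""

    def __init__(self, n_covariates, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_covariates, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_covariates)

    def forward(self, x):  # x: (batch, window, n_covariates)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # one-step-ahead forecast (adjusted scale)
```

In this pattern the statistical component handles level and seasonality per covariate, while the recurrent network captures the remaining cross-covariate dynamics; final forecasts are obtained by multiplying the network output back by the smoothing components held out during preprocessing.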