Hybridisation and ensemble learning are popular model fusion techniques for improving the predictive power of forecasting methods. With limited research investigating the combination of these two promising approaches, this paper focuses on the utility of the Exponential Smoothing-Recurrent Neural Network (ES-RNN) in the pool of base models for different ensembles. We compare against some state-of-the-art ensembling techniques and against arithmetic model averaging as a benchmark. We experiment with the M4 forecasting dataset of 100,000 time series, and the results show that Feature-based Forecast Model Averaging (FFORMA) is, on average, the best technique for late data fusion with the ES-RNN. However, on the M4 Daily subset, stacking was the only ensemble that successfully handled the case where all base models perform similarly. Our experimental results indicate that we attain state-of-the-art forecasting results compared to the N-BEATS benchmark. We conclude that model averaging is a more robust ensembling strategy than model selection and stacking. Further, the results show that gradient boosting is superior for implementing ensemble learning strategies.
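To make the contrast between the late-fusion strategies concrete, the sketch below (not the paper's implementation) compares arithmetic model averaging with stacking via a gradient-boosted meta-learner over a pool of base forecasts. The toy series and the three stand-in base forecasters are hypothetical; in the paper the pool would include ES-RNN forecasts and the meta-learner operates on held-out forecasts.

```python
# Minimal sketch: equal-weight model averaging vs. stacking with gradient boosting.
# All data and base "models" here are toy stand-ins, not the paper's setup.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy target series and three simulated base-model forecasts.
y = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.standard_normal(500)
base_forecasts = np.column_stack([
    np.roll(y, 1),                           # naive (last-value) forecaster
    np.convolve(y, np.ones(5) / 5, "same"),  # moving-average forecaster
    y + 0.2 * rng.standard_normal(500),      # noisy proxy for a learned model
])

train, test = slice(0, 400), slice(400, 500)

# 1) Arithmetic model averaging: equal-weight combination of base forecasts.
avg_pred = base_forecasts[test].mean(axis=1)

# 2) Stacking: a gradient-boosted meta-learner maps base forecasts to the target.
meta = GradientBoostingRegressor(random_state=0)
meta.fit(base_forecasts[train], y[train])
stack_pred = meta.predict(base_forecasts[test])

for name, pred in [("averaging", avg_pred), ("stacking", stack_pred)]:
    print(name, "MSE:", np.mean((y[test] - pred) ** 2))
```

When the base models are diverse and roughly comparable in accuracy, the averaged combination is hard to beat; the meta-learner can help when their relative performance varies, which mirrors the averaging-versus-stacking trade-off reported above.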