Long Short-Term Memory (LSTM) neural networks have been widely used for time series forecasting. However, LSTMs are prone to overfitting, which degrades their performance at test time. Several regularization techniques have been proposed in the literature to prevent overfitting in neural networks. In this paper, we first introduce the application of kernel flow methods to time series forecasting in general. We then examine the effectiveness of applying kernel flow regularization to LSTM layers to avoid overfitting. We describe a regularization method that applies the kernel flow loss function to LSTM layers. Our experimental results show that kernel flow outperforms baseline models on time series forecasting benchmarks. We also compare the effects of dropout and kernel flow regularization on LSTMs. The experiments illustrate that kernel flow achieves a regularization effect similar to that of dropout. They also show that on some time series datasets (e.g., power-load demand forecasting), the best results are obtained by combining kernel flow and dropout regularization with early stopping on LSTM layers.
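To make the idea concrete, the sketch below illustrates one way a kernel flow regularizer could be attached to an LSTM forecaster. It is a minimal sketch, not the paper's implementation: the choice of PyTorch, the Gaussian kernel, the single-layer architecture, and all names and hyperparameters (kernel_flow_loss, KFRegularizedLSTM, lengthscale, jitter, the weight 0.1) are illustrative assumptions. The loss follows the standard kernel flow construction, which compares the kernel interpolant built from a random half of a batch against the full batch.

```python
import torch
import torch.nn as nn

def kernel_flow_loss(h, y, lengthscale=1.0, jitter=1e-6):
    """Kernel flow loss rho = 1 - (Y_s' K_ss^{-1} Y_s) / (Y' K^{-1} Y), where the
    subscript s marks a random half of the batch. rho measures how much accuracy
    is lost when interpolating the targets from half the data; minimizing it
    pushes the hidden states h toward representations that generalize from
    subsamples. (Illustrative form; the paper's exact formulation may differ.)"""
    n = h.shape[0]
    idx = torch.randperm(n, device=h.device)[: n // 2]
    # Gaussian (RBF) kernel on the LSTM hidden states, with jitter for stability.
    K = torch.exp(-torch.cdist(h, h) ** 2 / (2.0 * lengthscale ** 2))
    K = K + jitter * torch.eye(n, device=h.device)
    K_ss = K[idx][:, idx]
    full = y.T @ torch.linalg.solve(K, y)
    half = y[idx].T @ torch.linalg.solve(K_ss, y[idx])
    return (1.0 - half / full).squeeze()

class KFRegularizedLSTM(nn.Module):
    """Hypothetical forecaster: an LSTM whose last hidden state feeds a linear head."""
    def __init__(self, n_features, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        out, _ = self.lstm(x)           # out: (batch, seq_len, hidden_size)
        h = out[:, -1, :]               # last hidden state of each sequence
        return self.head(h), h

# One training step: total loss = forecasting MSE + 0.1 * kernel flow loss on the
# hidden states (0.1 is an illustrative regularization weight).
model = KFRegularizedLSTM(n_features=4)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 24, 4)              # batch of 32 sequences, 24 steps, 4 features
y = torch.randn(32, 1)
pred, h = model(x)
loss = nn.functional.mse_loss(pred, y) + 0.1 * kernel_flow_loss(h, y)
opt.zero_grad()
loss.backward()
opt.step()
```

Because the kernel flow term depends only on the hidden states and targets, it composes directly with dropout and early stopping, which is the combination the experiments report as strongest on some datasets.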