Time-series forecasting (TSF) is a long-standing problem in the field of artificial intelligence. Models such as the Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU) have contributed to improving the predictive accuracy of TSF. Furthermore, model structures have been proposed that incorporate time-series decomposition methods, such as seasonal-trend decomposition using Loess (STL), to further improve predictive accuracy. However, because this approach trains an independent model for each decomposed component, it cannot learn the relationships between the time-series components. In this study, we propose a new neural architecture, the correlation recurrent unit (CRU), which performs time-series decomposition within a neural cell and learns the correlations (autocorrelation and correlation) between the decomposed components. The proposed architecture was evaluated in comparative experiments against previous studies on five univariate and four multivariate time-series datasets. The results show that both long- and short-term predictive performance improved by more than 10%. The experimental results demonstrate that the proposed CRU outperforms other neural architectures on TSF problems.
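For context, the following is a minimal sketch of the decompose-then-forecast baseline that the paper contrasts against, using STL from statsmodels and a separate autoregressive model per component; the synthetic series and the per-component model choices are illustrative assumptions, not the paper's actual setup.

# Decompose-then-forecast baseline: STL splits the series into trend,
# seasonal, and residual parts, and each part is forecast by an
# independent model, so cross-component correlations are never learned.
# (Synthetic data and AR models are illustrative only.)
import numpy as np
from statsmodels.tsa.seasonal import STL
from statsmodels.tsa.ar_model import AutoReg

# Synthetic monthly series: linear trend + yearly seasonality + noise.
rng = np.random.default_rng(0)
t = np.arange(240)
y = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, t.size)

# 1) Decompose with STL (period = 12 for monthly data).
components = STL(y, period=12).fit()

horizon = 12
forecast = np.zeros(horizon)
for part in (components.trend, components.seasonal, components.resid):
    # 2) Fit a separate AR model per component; the models never see
    #    each other, which is exactly the limitation the proposed CRU
    #    is designed to remove.
    model = AutoReg(part, lags=12).fit()
    forecast += model.predict(start=len(part), end=len(part) + horizon - 1)

# 3) Recombine the per-component forecasts by summation.
print(forecast)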