Modeling multivariate time series has long attracted researchers from a diverse range of fields, including economics, finance, and traffic. A basic assumption behind multivariate time series forecasting is that its variables depend on one another, yet existing methods fall short of fully exploiting the latent spatial dependencies between pairs of variables. In recent years, meanwhile, graph neural networks (GNNs) have shown a strong ability to handle relational dependencies. However, GNNs require a well-defined graph structure for information propagation, which means they cannot be applied directly to multivariate time series whose dependencies are not known in advance. In this paper, we propose a general graph neural network framework designed specifically for multivariate time series data. Our approach automatically extracts the uni-directed relations among variables through a graph learning module, into which external knowledge such as variable attributes can be easily integrated. A novel mix-hop propagation layer and a dilated inception layer are further proposed to capture the spatial and temporal dependencies within the time series. The graph learning, graph convolution, and temporal convolution modules are jointly learned in an end-to-end framework. Experimental results show that our proposed model outperforms state-of-the-art baseline methods on 3 out of 4 benchmark datasets and achieves performance on par with other approaches on two traffic datasets that provide extra structural information.
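To make the graph learning idea concrete, below is a minimal sketch of how a module of this kind might derive a sparse, uni-directed adjacency matrix from trainable node embeddings, so that no pre-defined graph is needed. The class name, embedding dimension, saturation coefficient, and top-k sparsification value are illustrative assumptions, not the paper's exact layer definitions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphLearner(nn.Module):
    """Sketch of a graph learning module that infers directed relations
    among the variables of a multivariate time series."""

    def __init__(self, num_nodes: int, emb_dim: int = 40,
                 alpha: float = 3.0, k: int = 20):
        super().__init__()
        self.emb1 = nn.Embedding(num_nodes, emb_dim)  # source-role embeddings
        self.emb2 = nn.Embedding(num_nodes, emb_dim)  # target-role embeddings
        self.lin1 = nn.Linear(emb_dim, emb_dim)
        self.lin2 = nn.Linear(emb_dim, emb_dim)
        self.alpha = alpha  # controls the saturation rate of tanh (assumed value)
        self.k = k          # keep only the k strongest neighbours per node (assumed value)

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        m1 = torch.tanh(self.alpha * self.lin1(self.emb1(idx)))
        m2 = torch.tanh(self.alpha * self.lin2(self.emb2(idx)))
        # The antisymmetric score m1 m2^T - m2 m1^T makes relations uni-directed:
        # if A[i, j] > 0 then A[j, i] = 0.
        adj = F.relu(torch.tanh(self.alpha * (m1 @ m2.T - m2 @ m1.T)))
        # Sparsify: retain the top-k entries in each row, zero out the rest.
        mask = torch.zeros_like(adj)
        topk = adj.topk(self.k, dim=1)
        mask.scatter_(1, topk.indices, 1.0)
        return adj * mask


# Usage sketch: learn a directed graph over 137 variables and feed the
# resulting adjacency matrix to downstream graph convolution layers.
learner = GraphLearner(num_nodes=137, k=20)
A = learner(torch.arange(137))  # shape (137, 137)
```

Because the module is differentiable end to end, it can be trained jointly with the graph convolution and temporal convolution modules, as the abstract describes; external knowledge such as variable attributes could, under this sketch, be injected by initializing or augmenting the node embeddings.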