Traffic flow forecasting plays an important role in intelligent transportation systems. Most previous works combine graph neural networks and attention mechanisms within the Transformer architecture to discover spatiotemporal dependencies and dynamic relationships. However, they do not thoroughly consider correlation information among spatiotemporal sequences. In this paper, we build on the maximal information coefficient (MIC) to present two elaborate spatiotemporal representations: spatial correlation information (SCorr) and temporal correlation information (TCorr). Using SCorr, we propose a correlation information-based spatiotemporal network (CorrSTN) that includes a dynamic graph neural network component, which integrates correlation information into the spatial structure effectively, and a multi-head attention component, which models dynamic temporal dependencies accurately. Utilizing TCorr, we explore the correlation patterns among different periodic data to identify the most relevant data, and then design an efficient data selection scheme to further enhance model performance. Experimental results on highway traffic flow (PEMS07 and PEMS08) and metro crowd flow (HZME inflow and outflow) datasets demonstrate that CorrSTN outperforms state-of-the-art methods in predictive performance. In particular, on the HZME (outflow) dataset, our model improves on the ASTGNN model by 12.7%, 14.4% and 27.4% in MAE, RMSE and MAPE, respectively.
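Both SCorr and TCorr rest on the maximal information coefficient, which scores the dependence between two series in [0, 1] by searching over grid discretizations. As a rough, hedged illustration of that underlying quantity (not the paper's implementation), the Python sketch below approximates MIC with equal-width grids under the common B(n) = n^0.6 resolution budget; the true MIC statistic optimizes grid partitions via dynamic programming, and the names `mic_approx`, `normalized_mi`, and the `flow` example data are all hypothetical.

```python
import numpy as np
from itertools import product

def normalized_mi(x, y, nx, ny):
    """Mutual information of x and y under an equal-width (nx, ny) grid,
    normalized by log(min(nx, ny)) so the score lies in [0, 1]."""
    joint, _, _ = np.histogram2d(x, y, bins=(nx, ny))
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal of x, shape (nx, 1)
    py = joint.sum(axis=0, keepdims=True)   # marginal of y, shape (1, ny)
    nz = joint > 0                          # skip empty cells to avoid log(0)
    mi = (joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum()
    return mi / np.log(min(nx, ny))

def mic_approx(x, y, b_exponent=0.6):
    """Scan all grid resolutions with nx * ny <= n ** b_exponent and keep the
    largest normalized mutual information -- a crude stand-in for the
    partition optimization used by the true MIC statistic."""
    budget = int(len(x) ** b_exponent)
    best = 0.0
    for nx, ny in product(range(2, budget + 1), repeat=2):
        if nx * ny <= budget:
            best = max(best, normalized_mi(x, y, nx, ny))
    return best

# Hypothetical usage: a pairwise MIC matrix over sensor time series,
# analogous in spirit to a SCorr-style spatial correlation representation.
rng = np.random.default_rng(0)
flow = rng.random((4, 500))                  # 4 fake sensors, 500 time steps
scorr = np.array([[mic_approx(flow[i], flow[j]) for j in range(4)]
                  for i in range(4)])
print(np.round(scorr, 3))                    # diagonal entries are largest
```

A TCorr-style score would apply the same measure along the temporal axis, e.g. between a series and its day-lagged or week-lagged copies, to rank which periodic segments are most informative for the data selection scheme described above.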