Long-term time series forecasting focuses on leveraging historical data to predict future trends. The core challenge lies in effectively modeling dependencies both within sequences and across channels. Convolutional neural networks and linear models often excel at sequence modeling but frequently fall short in capturing complex channel dependencies. In contrast, Transformer-based models, which apply attention mechanisms to both sequences and channels, have demonstrated strong predictive performance. Our research proposes a new approach to capturing sequence and channel dependencies: AverageTime, an exceptionally simple yet effective architecture. By combining mixed channel embeddings with averaging operations, AverageTime captures sequence and channel correlations separately through channel mapping and result averaging. In addition, we integrate clustering methods to further accelerate training. Experiments on real-world datasets demonstrate that AverageTime surpasses state-of-the-art models in predictive performance while maintaining efficiency comparable to lightweight linear models, providing a new and effective framework for modeling long time series.
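To make the channel-mapping-plus-averaging idea concrete, the following is a minimal PyTorch sketch under stated assumptions: a shared linear map along the time axis stands in for the channel mapping, and a cross-channel mean stands in for the result averaging. `AverageTimeSketch`, `channel_map`, and the 50/50 blend are illustrative choices, not the paper's exact design, which also includes mixed channel embeddings and clustering.

```python
import torch
import torch.nn as nn

class AverageTimeSketch(nn.Module):
    """Hypothetical sketch of the averaging idea from the abstract:
    a per-channel linear map models sequence dependencies, and
    averaging the mapped channels injects cross-channel information."""

    def __init__(self, seq_len: int, pred_len: int):
        super().__init__()
        # Shared linear projection along the time axis (sequence modeling).
        self.channel_map = nn.Linear(seq_len, pred_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, channels) -> (batch, channels, seq_len)
        x = x.transpose(1, 2)
        per_channel = self.channel_map(x)                     # channel-wise mapping
        channel_mean = per_channel.mean(dim=1, keepdim=True)  # cross-channel average
        out = (per_channel + channel_mean) / 2                # blend both views (assumed weighting)
        return out.transpose(1, 2)                            # (batch, pred_len, channels)

# Usage: 96-step history, 24-step forecast, 7 channels.
model = AverageTimeSketch(seq_len=96, pred_len=24)
y = model(torch.randn(32, 96, 7))
print(y.shape)  # torch.Size([32, 24, 7])
```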