Unsupervised/self-supervised representation learning for time series is a challenging problem because of the complex dynamics and sparse annotations of time series data. Existing works mainly adopt the framework of contrastive learning, using time-based augmentation techniques to sample positives and negatives for contrastive training. However, they mostly rely on segment-level augmentation derived from time slicing, which can introduce sampling bias and incorrect optimization with false negatives due to the loss of global context. Moreover, they pay little attention to incorporating spectral information into the feature representation. In this paper, we propose a unified framework, namely Bilinear Temporal-Spectral Fusion (BTSF). Specifically, we first utilize instance-level augmentation, a simple dropout applied to the entire time series, to maximally capture long-term dependencies. We then devise a novel iterative bilinear temporal-spectral fusion that explicitly encodes the affinities of abundant time-frequency pairs and iteratively refines representations in a fusion-and-squeeze manner with Spectrum-to-Time (S2T) and Time-to-Spectrum (T2S) aggregation modules. We conduct downstream evaluations on three major tasks for time series: classification, forecasting, and anomaly detection. Experimental results show that BTSF consistently and significantly outperforms state-of-the-art methods.
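The instance-level augmentation can be illustrated with a few lines of code. Below is a minimal sketch in PyTorch, assuming a `(batch, channels, length)` tensor layout; the function name `instance_level_augment` and the dropout rate `p` are illustrative choices, not values taken from the paper.

```python
import torch
import torch.nn.functional as F

def instance_level_augment(x: torch.Tensor, p: float = 0.1):
    """Produce two stochastic views of the entire series via independent
    dropout masks; no time slicing, so the global context is preserved.

    x: (batch, channels, length) raw time series.
    """
    # F.dropout with training=True samples a fresh Bernoulli mask per call,
    # so the two views differ while both retaining long-term dependencies.
    view1 = F.dropout(x, p=p, training=True)
    view2 = F.dropout(x, p=p, training=True)
    return view1, view2
```

In contrast to segment-level slicing, both views here span the full series, so a positive pair never loses the global context that the abstract identifies as the source of false negatives.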
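The iterative fusion step can likewise be sketched. The toy module below, again in PyTorch, uses a bilinear form to score every time-frequency pair and attention-style aggregations for the S2T and T2S directions; the class name, the residual updates, and the softmax-based aggregation are simplifying assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class IterativeBilinearFusion(nn.Module):
    """Fusion-and-squeeze loop: a bilinear map scores all (time step,
    frequency bin) pairs, then S2T / T2S aggregations squeeze the pair
    affinities back into refined temporal and spectral features."""

    def __init__(self, dim: int, n_iters: int = 2):
        super().__init__()
        self.bilinear = nn.Parameter(torch.randn(dim, dim) * dim ** -0.5)
        self.s2t = nn.Linear(dim, dim)  # Spectrum-to-Time aggregation
        self.t2s = nn.Linear(dim, dim)  # Time-to-Spectrum aggregation
        self.n_iters = n_iters

    def forward(self, ft: torch.Tensor, fs: torch.Tensor):
        # ft: (B, T, D) temporal features; fs: (B, K, D) spectral features.
        for _ in range(self.n_iters):
            # Bilinear affinities of all time-frequency pairs: (B, T, K).
            aff = torch.einsum('btd,de,bke->btk', ft, self.bilinear, fs)
            attn_t = aff.softmax(dim=-1)  # each time step over frequency bins
            attn_s = aff.softmax(dim=-2)  # each frequency bin over time steps
            # S2T: squeeze spectral evidence into the temporal stream.
            ft = ft + self.s2t(torch.einsum('btk,bkd->btd', attn_t, fs))
            # T2S: squeeze temporal evidence into the spectral stream.
            fs = fs + self.t2s(torch.einsum('btk,btd->bkd', attn_s, ft))
        return ft, fs
```

Each iteration re-scores the time-frequency affinities with the refined features, which is what makes the fusion iterative rather than a one-shot cross-attention between the two streams.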