This paper shows that a masked autoencoder with an interpolator (InterpoMAE) is a scalable self-supervised generative model for time series. InterpoMAE masks random patches of the input time series and recovers the missing patches in latent space. The core design choice is that no mask token is used: InterpoMAE disentangles missing-patch recovery from the decoder, and an interpolator recovers the missing patches directly, without mask tokens. This design enables InterpoMAE to consistently and significantly outperform state-of-the-art (SoTA) benchmarks in time series generation. Our approach also shows promising scaling behaviour on various downstream tasks such as time series classification, prediction, and imputation. As the only self-supervised generative model for time series, InterpoMAE is the first in the literature to allow explicit management of the synthetic data. Time series generation may now follow the trajectory of self-supervised learning.
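To make the masking-and-recovery idea concrete, below is a minimal NumPy sketch of patch masking with latent-space interpolation in place of mask tokens. The toy linear encoder/decoder, the shapes, and the choice of plain linear interpolation are illustrative assumptions for exposition, not the paper's actual architecture or training procedure.

```python
# Minimal sketch: mask random patches, encode only the visible ones, and
# recover masked latents by interpolating between visible latents, so no
# learned mask token is ever inserted into the encoder input.
# (Toy linear encoder/decoder; all choices here are illustrative.)
import numpy as np

rng = np.random.default_rng(0)

T, P, D = 64, 8, 16                   # series length, patch size, latent dim
n_patches = T // P

x = np.sin(np.linspace(0, 6 * np.pi, T))   # toy univariate series
patches = x.reshape(n_patches, P)          # (n_patches, P)

# Random patch mask: True = masked (to be recovered), False = visible.
mask = rng.random(n_patches) < 0.5
mask[[0, -1]] = False                      # keep endpoints visible

# Toy "encoder": a fixed random linear map applied only to visible patches.
W_enc = rng.normal(scale=0.1, size=(P, D))
z = np.zeros((n_patches, D))
z[~mask] = patches[~mask] @ W_enc

# "Interpolator": fill masked latent tokens from their visible neighbours.
vis_idx = np.flatnonzero(~mask)
for d in range(D):
    z[mask, d] = np.interp(np.flatnonzero(mask), vis_idx, z[vis_idx, d])

# Toy "decoder": pseudo-inverse of the encoder map back to patch space.
x_hat = (z @ np.linalg.pinv(W_enc)).reshape(T)
print("masked patches:", np.flatnonzero(mask))
print("mean abs error on masked region:",
      np.abs(x_hat - x).reshape(n_patches, P)[mask].mean())
```

The point of the sketch is the data flow: recovery of missing content happens in latent space, before decoding, which is what lets the decoder be freed from the recovery task.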