Neural style transfer is a powerful computer vision technique that can apply the artistic "style" of one image to the "content" of another. The underlying theory behind the approach relies on the assumption that the style of an image is represented by the Gram matrix of its features, which are typically extracted from a pre-trained convolutional neural network (e.g., VGG-19). This idea does not straightforwardly extend to time series stylization, since notions of style for two-dimensional images are not analogous to notions of style for one-dimensional time series. In this work, a novel formulation of time series style transfer is proposed for the purpose of synthetic data generation and enhancement. We introduce the concept of stylized features for time series, which is directly related to time series realism properties, and propose a novel stylization algorithm, called StyleTime, that uses explicit feature extraction techniques to combine the underlying content (trend) of one time series with the style (distributional properties) of another. Further, we discuss evaluation metrics and compare our work to existing state-of-the-art time series generation and augmentation schemes. To validate the effectiveness of our methods, we use stylized synthetic data as a means of data augmentation to improve the performance of recurrent neural network models on several forecasting tasks.
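For context on the image-domain formulation referenced above, the sketch below shows how a style representation is conventionally computed as the Gram matrix of feature maps taken from a pre-trained VGG-19, following the standard neural style transfer recipe (Gatys et al.). It is a minimal illustration of that background idea, not this paper's method; the use of PyTorch/torchvision and the particular layer chosen are assumptions made only for the example.

```python
import torch
import torchvision.models as models

# Load a pre-trained VGG-19 and keep only the convolutional feature extractor.
vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.eval()

def gram_matrix(feature_map: torch.Tensor) -> torch.Tensor:
    """Gram matrix of a (batch, channels, height, width) feature map.

    Entry G[i, j] is the inner product between channels i and j, so the
    matrix records which features co-occur -- the classical proxy for
    image "style" in neural style transfer.
    """
    b, c, h, w = feature_map.shape
    f = feature_map.view(b, c, h * w)                      # flatten spatial dims
    return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)   # normalize by size

# Example: style descriptor of a stand-in image at an intermediate layer.
image = torch.randn(1, 3, 224, 224)        # placeholder for a real, preprocessed image
with torch.no_grad():
    feats = vgg[:21](image)                # features up to relu4_1 (illustrative layer choice)
G = gram_matrix(feats)                     # shape (1, 512, 512) style descriptor
```

As the abstract notes, this Gram-matrix notion of style has no direct one-dimensional analog, which is why the time-series setting addressed here relies on explicitly extracted stylized features instead of convolutional feature maps.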
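To make the idea of combining one series' content (trend) with another's style (distributional properties) concrete, the toy sketch below estimates a trend with a moving average and rescales the detrended residuals to match the marginal statistics of a second series. The helper names, the moving-average trend estimate, and the choice of mean/standard deviation as style statistics are hypothetical choices for illustration only; this is not the StyleTime algorithm, which the paper defines through its own stylized features and stylization procedure.

```python
import numpy as np

def moving_average_trend(x: np.ndarray, window: int = 20) -> np.ndarray:
    """Illustrative trend (content) estimate via a moving average."""
    kernel = np.ones(window) / window
    # 'same' mode keeps the output the same length as the input.
    return np.convolve(x, kernel, mode="same")

def naive_stylize(content: np.ndarray, style: np.ndarray, window: int = 20) -> np.ndarray:
    """Toy combination of one series' trend with another's residual statistics.

    NOTE: hypothetical illustration only -- not the StyleTime algorithm.
    """
    trend = moving_average_trend(content, window)                 # content: underlying trend
    residual = content - trend                                    # detrended fluctuations
    style_resid = style - moving_average_trend(style, window)     # style: fluctuation statistics
    # Rescale the content residuals so their mean/std match the style residuals.
    z = (residual - residual.mean()) / (residual.std() + 1e-8)
    return trend + z * style_resid.std() + style_resid.mean()

# Example: impose the fluctuation statistics of a noisy series onto a smooth trend.
t = np.linspace(0, 10, 500)
content_series = np.sin(t) + 0.05 * np.random.randn(500)
style_series = np.cumsum(0.05 * np.random.randn(500))
stylized = naive_stylize(content_series, style_series)
```

A simple rescaling like this only matches first- and second-order statistics of the residuals; the point of the stylized features introduced in the paper is to capture richer realism properties of the target series than such summary statistics convey.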