Time series data are ubiquitous in real-world applications. However, unexpected incidents such as broken sensors or signal loss introduce missing values into time series, making the data difficult to use. Missing values in turn harm downstream applications such as classification, regression, sequential data integration, and forecasting, which raises the demand for data imputation. Time series imputation is already a well-studied problem with several categories of methods. However, many of these approaches treat time series as ordinary structured data and rarely exploit the temporal relations among observations, discarding the information carried by the time dimension. Recently, deep learning models have attracted great attention. Deep-learning-based time series methods have made progress by using models such as RNNs, which capture temporal information from the data. In this paper, we focus on deep learning techniques for time series imputation, which have recently advanced this field. We review and discuss their model architectures, their strengths and weaknesses, and their empirical performance, in order to show the development of time series imputation methods.
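To make the general idea concrete, the following is a minimal illustrative sketch (not any specific method surveyed here) of RNN-based imputation: missing entries are zero-filled, an observation mask is fed alongside the values, and the network's reconstruction fills the gaps. The class name `SimpleRNNImputer` and all hyperparameters are hypothetical choices for illustration only.

```python
import torch
import torch.nn as nn

class SimpleRNNImputer(nn.Module):
    """Minimal GRU-based imputer (illustrative sketch, not a surveyed model):
    missing entries are zero-filled and a mask channel tells the network
    which values were actually observed."""
    def __init__(self, n_features, hidden_size=64):
        super().__init__()
        self.rnn = nn.GRU(input_size=2 * n_features, hidden_size=hidden_size,
                          batch_first=True)
        self.out = nn.Linear(hidden_size, n_features)

    def forward(self, x, mask):
        # x:    (batch, time, features) with missing entries set to 0
        # mask: (batch, time, features), 1 where observed, 0 where missing
        h, _ = self.rnn(torch.cat([x * mask, mask], dim=-1))
        x_hat = self.out(h)                        # reconstruction at every step
        # keep observed values, fill missing ones with the model's estimate
        return mask * x + (1 - mask) * x_hat

# Toy usage: train by reconstructing artificially removed observations.
model = SimpleRNNImputer(n_features=3)
x = torch.randn(8, 20, 3)                          # hypothetical complete series
mask = (torch.rand_like(x) > 0.2).float()          # simulate ~20% missingness
imputed = model(x * mask, mask)
loss = ((imputed - x) ** 2 * (1 - mask)).sum() / (1 - mask).sum()
loss.backward()
```

Because the recurrent state carries information across time steps, the estimate at a missing position can draw on earlier observations, which is the property that distinguishes such models from imputation methods that ignore temporal order.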