Deep learning inspired by differential equations is a recent research trend and has achieved state-of-the-art performance on many machine learning tasks. Among them, time-series modeling with neural controlled differential equations (NCDEs) is considered a breakthrough. In many cases, NCDE-based models not only provide better accuracy than recurrent neural networks (RNNs) but also make it possible to process irregular time series. In this work, we enhance NCDEs by redesigning their core part: generating a continuous path from a discrete time-series input. NCDEs typically use interpolation algorithms to convert discrete time-series samples into continuous paths. Instead, we propose to i) generate another latent continuous path using an encoder-decoder architecture, which replaces the interpolation step of NCDEs, i.e., our neural network-based interpolation versus the existing explicit interpolation, and ii) exploit the generative nature of the decoder, i.e., extrapolation beyond the time domain of the original data when needed. Our NCDE design can therefore use both the interpolated and the extrapolated information for downstream machine learning tasks. In our experiments with 5 real-world datasets and 12 baselines, our extrapolation- and interpolation-based NCDEs outperform the existing baselines by non-trivial margins.
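To make the contrast concrete, the following minimal sketch shows the *explicit* interpolation step the abstract refers to: turning irregularly sampled discrete observations into a continuous path that an NCDE can then integrate against. The sample times and values are hypothetical, and a cubic spline from SciPy stands in for whichever interpolation scheme a given NCDE implementation uses.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical discrete, irregularly spaced time-series observations.
t = np.array([0.0, 0.5, 1.2, 2.0, 3.5])
x = np.array([1.0, 0.2, -0.4, 0.9, 1.5])

# Explicit interpolation: build a continuous path X(t) from the samples,
# the preprocessing step that NCDEs typically apply before solving the CDE.
path = CubicSpline(t, x)

# The path reproduces the observations exactly at the sample times
# and is defined at any intermediate time, which is what lets NCDEs
# handle irregular sampling.
assert np.allclose(path(t), x)
mid_value = float(path(1.0))  # value of the continuous path between samples
```

The paper's proposal replaces this fixed, hand-chosen interpolant with a learned encoder-decoder path, which additionally allows extrapolation past `t[-1]`, something a plain interpolant is not designed for.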