Recurrent neural networks (RNNs) are a popular choice for modeling sequential data. Modern RNN architectures assume constant time intervals between observations. However, in many datasets (e.g. medical records) observation times are irregular and can carry important information. To address this challenge, we propose continuous recurrent units (CRUs) -- a neural architecture that can naturally handle irregular intervals between observations. The CRU assumes a hidden state that evolves according to a linear stochastic differential equation and is integrated into an encoder-decoder framework. The recursive computations of the CRU can be derived using the continuous-discrete Kalman filter and are available in closed form. The resulting recurrent architecture has temporal continuity between hidden states and a gating mechanism that can optimally integrate noisy observations. We derive an efficient parameterization scheme for the CRU that leads to a fast implementation, f-CRU. We empirically study the CRU on a number of challenging datasets and find that it can interpolate irregular time series better than methods based on neural ordinary differential equations.
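The closed-form recursions mentioned above come from the continuous-discrete Kalman filter: between observations, the mean and covariance of the latent state are propagated through the linear SDE over the (irregular) elapsed time, and at each observation a standard Kalman update acts as the gating mechanism. The following is a minimal sketch of that filter, not the paper's implementation; the drift matrix `A`, diffusion `Q`, observation model `H`, `R`, and the van Loan matrix-exponential trick for the process-noise integral are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

def predict(m, P, A, Q, dt):
    """Propagate the latent Gaussian state through dz = A z dt + dW
    (diffusion Q) over an irregular interval dt, in closed form."""
    d = A.shape[0]
    # Van Loan block construction: expm of [[A, Q], [0, -A^T]] * dt
    # yields exp(A dt) and the integrated process-noise covariance.
    M = np.zeros((2 * d, 2 * d))
    M[:d, :d] = A
    M[:d, d:] = Q
    M[d:, d:] = -A.T
    Phi = expm(M * dt)
    F = Phi[:d, :d]                      # transition matrix exp(A dt)
    P_pred = F @ P @ F.T + Phi[:d, d:] @ F.T
    return F @ m, P_pred

def update(m, P, y, H, R):
    """Discrete Kalman update at an observation y; the Kalman gain K
    plays the role of a gate weighting observation against prediction."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    m_post = m + K @ (y - H @ m)
    P_post = (np.eye(len(m)) - K @ H) @ P
    return m_post, P_post
```

Because the prediction step takes `dt` as an argument, the same recursion handles arbitrary gaps between observations; a learned network would parameterize `A`, `Q`, `H`, and `R`.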