Recurrent neural networks (RNNs) are a popular choice for modeling sequential data. Modern RNN architectures assume constant time intervals between observations. However, in many datasets (e.g. medical records) observation times are irregular and can carry important information. To address this challenge, we propose continuous recurrent units (CRUs) -- a neural architecture that can naturally handle irregular intervals between observations. The CRU assumes a hidden state, which evolves according to a linear stochastic differential equation and is integrated into an encoder-decoder framework. The recursive computations of the CRU can be derived using the continuous-discrete Kalman filter and are in closed form. The resulting recurrent architecture has temporal continuity between hidden states and a gating mechanism that can optimally integrate noisy observations. We derive an efficient parameterization scheme for the CRU that leads to a fast implementation (f-CRU). We empirically study the CRU on a number of challenging datasets and find that it can interpolate irregular time series better than methods based on neural ordinary differential equations.
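The closed-form recursions mentioned above come from the continuous-discrete Kalman filter: the latent state is propagated through the linear SDE over each (possibly irregular) gap between observations, then corrected by a standard Kalman update. Below is a minimal generic sketch of these two steps in NumPy/SciPy -- not the paper's CRU or f-CRU parameterization; the matrices `A`, `Qc`, `H`, `R` and the Van Loan discretization of the process noise are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

def predict(m, P, A, Qc, dt):
    """Propagate mean m and covariance P of a linear SDE
    dz = A z dt + G dW (with Qc = G G^T) over an irregular gap dt.
    Uses the Van Loan matrix-exponential trick to get the discrete
    transition Phi = exp(A dt) and the accumulated process noise Qd."""
    d = A.shape[0]
    M = np.block([[-A, Qc], [np.zeros((d, d)), A.T]]) * dt
    E = expm(M)
    Phi = E[d:, d:].T          # exp(A dt)
    Qd = Phi @ E[:d, d:]       # integral of exp(As) Qc exp(A^T s) ds
    return Phi @ m, Phi @ P @ Phi.T + Qd

def update(m, P, x, H, R):
    """Standard Kalman update for an observation x = H z + noise;
    the gain K acts as the gating mechanism weighting the observation
    against the predicted state."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    m_new = m + K @ (x - H @ m)
    P_new = (np.eye(len(m)) - K @ H) @ P
    return m_new, P_new
```

Because both steps are closed form, no ODE solver is needed at inference time; the gap `dt` enters only through the matrix exponential, which is what lets the recursion handle irregular observation times.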