We introduce a new class of time-continuous recurrent neural network models. Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates. The resulting models represent dynamical systems with varying (i.e., \emph{liquid}) time-constants coupled to their hidden state, with outputs being computed by numerical differential equation solvers. These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations, and give rise to improved performance on time-series prediction tasks. To demonstrate these properties, we first take a theoretical approach to find bounds over their dynamics and compute their expressive power by the \emph{trajectory length} measure in a latent trajectory space. We then conduct a series of time-series prediction experiments comparing the approximation capability of Liquid Time-Constant Networks (LTCs) with that of modern RNNs. Code and data are available at \url{https://github.com/raminmh/liquid_time_constant_networks}.