We introduce a new class of time-continuous recurrent neural network models. Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates. The resulting models represent dynamical systems with varying (i.e., liquid) time-constants coupled to their hidden state, with outputs being computed by numerical differential equation solvers. These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations, and give rise to improved performance on time-series prediction tasks. To demonstrate these properties, we first take a theoretical approach to find bounds over their dynamics and compute their expressive power by the trajectory length measure in latent trajectory space. We then conduct a series of time-series prediction experiments to manifest the approximation capability of Liquid Time-Constant Networks (LTCs) compared to classical and modern RNNs. Code and data are available at https://github.com/raminmh/liquid_time_constant_networks
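To make the architecture described above concrete, the following is a minimal sketch (not the authors' implementation from the linked repository) of a single liquid time-constant cell update, assuming the state equation dx/dt = -[1/tau + f(x, I; theta)] x + f(x, I; theta) A from the paper and a simple fused semi-implicit Euler step as the numerical solver; the names `ltc_step`, `W`, `b`, `tau`, `A` are illustrative placeholders, not the repository's API.

```python
import numpy as np

def ltc_step(x, I, W, b, tau, A, dt=0.1):
    """One hidden-state update of a liquid time-constant (LTC) cell (sketch).

    x   : (n_hidden,)            current hidden state
    I   : (n_in,)                current input sample
    W   : (n_hidden, n_hidden+n_in) weights of the nonlinear gate f
    b   : (n_hidden,)            gate bias
    tau : (n_hidden,)            base time-constants
    A   : (n_hidden,)            bias vector scaling the gate's contribution
    """
    # Nonlinear gate f(x, I; theta) coupling hidden state and input.
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, I]) + b)))
    # Fused semi-implicit Euler step: the effective time-constant
    # 1/tau + f(x, I; theta) varies with state and input, hence "liquid".
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Tiny usage example on a random input sequence (illustrative only).
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 8
W = rng.standard_normal((n_hidden, n_hidden + n_in)) * 0.1
b = np.zeros(n_hidden)
tau = np.ones(n_hidden)
A = rng.standard_normal(n_hidden)
x = np.zeros(n_hidden)
for t in range(20):
    x = ltc_step(x, rng.standard_normal(n_in), W, b, tau, A)
print(x)
```

Because the gate output is bounded in (0, 1) and tau is positive, the denominator of the update stays above one, which is consistent with the stable and bounded behavior claimed in the abstract.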