Recurrent neural networks (RNNs), temporal convolutions, and neural differential equations (NDEs) are popular families of deep learning models for time-series data, each with unique strengths and tradeoffs in modeling power and computational efficiency. We introduce a simple sequence model inspired by control systems that generalizes these approaches while addressing their shortcomings. The Linear State-Space Layer (LSSL) maps a sequence $u \mapsto y$ by simply simulating a linear continuous-time state-space representation $\dot{x} = Ax + Bu, y = Cx + Du$. Theoretically, we show that LSSL models are closely related to the three aforementioned families of models and inherit their strengths. For example, they generalize convolutions to continuous time, explain common RNN heuristics, and share features of NDEs such as time-scale adaptation. We then incorporate and generalize recent theory on continuous-time memorization to introduce a trainable subset of structured matrices $A$ that endow LSSLs with long-range memory. Empirically, stacking LSSL layers into a simple deep neural network obtains state-of-the-art results across time-series benchmarks for long-range dependencies in sequential image classification, real-world healthcare regression tasks, and speech. On a difficult speech classification task with length-16000 sequences, LSSL outperforms prior approaches by 24 accuracy points, and even outperforms baselines that use hand-crafted features on 100x shorter sequences.
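To make the mapping $u \mapsto y$ concrete, the sketch below shows one way an LSSL-style layer could be evaluated: discretize the continuous state-space system with a bilinear (Tustin) transform (the paper uses a generalized bilinear transform, of which this is one case) and unroll the resulting linear recurrence. This is a minimal illustrative sketch, not the paper's implementation: the function name `lssl_forward` and the toy parameters are assumptions, and in the paper $A$ is a structured matrix derived from continuous-time memorization theory rather than random.

```python
import numpy as np

def lssl_forward(u, A, B, C, D, dt=1.0):
    """Map a length-L input sequence u to an output sequence y by
    discretizing the continuous state-space model
        x'(t) = A x(t) + B u(t),   y(t) = C x(t) + D u(t)
    with the bilinear (Tustin) transform, then unrolling the linear
    recurrence x_k = Ad x_{k-1} + Bd u_k."""
    n = A.shape[0]
    I = np.eye(n)
    # Bilinear discretization with step size dt.
    Ad = np.linalg.solve(I - (dt / 2) * A, I + (dt / 2) * A)
    Bd = np.linalg.solve(I - (dt / 2) * A, dt * B)
    x = np.zeros(n)
    ys = []
    for u_k in u:                   # O(L) sequential recurrence
        x = Ad @ x + Bd * u_k       # state update
        ys.append(C @ x + D * u_k)  # readout
    return np.array(ys)

# Toy usage (hypothetical parameters): a roughly stable random
# state-space model applied to a white-noise input sequence.
rng = np.random.default_rng(0)
n, L = 4, 16
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
B = rng.standard_normal(n)
C = rng.standard_normal(n)
D = 1.0
y = lssl_forward(rng.standard_normal(L), A, B, C, D, dt=0.1)
print(y.shape)  # (16,)
```

Because the unrolled recurrence is linear and time-invariant, the same layer also admits an equivalent convolutional view of the input-output map, which is one sense in which LSSLs connect the recurrent and convolutional model families described in the abstract.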