One of the most influential results in neural network theory is the universal approximation theorem [1, 2, 3], which states that continuous functions can be approximated to arbitrary accuracy by single-hidden-layer feedforward neural networks. The purpose of this paper is to establish a result in this spirit for the approximation of general discrete-time linear dynamical systems, including time-varying systems, by recurrent neural networks (RNNs). For the subclass of linear time-invariant (LTI) systems, we devise a quantitative version of this statement. Specifically, measuring the complexity of the class of LTI systems under consideration through metric entropy in the sense of [4], we show that RNNs can optimally learn, or identify in system-theory parlance, stable LTI systems. For LTI systems whose input-output relation is characterized by a difference equation, this means that RNNs can learn the difference equation from input-output traces in a metric-entropy-optimal manner.
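For concreteness, an LTI system of the kind referred to above is described by a difference equation of the form
\[
  y_t \;=\; \sum_{k=1}^{p} a_k\, y_{t-k} \;+\; \sum_{k=0}^{q} b_k\, x_{t-k},
\]
where $(x_t)$ and $(y_t)$ denote the input and output sequences, respectively; the orders $p, q$ and the coefficients $a_k, b_k$ are generic placeholders used here for illustration and are not quantities specific to this paper. In this picture, learning the difference equation from input-output traces amounts to identifying the coefficients $a_1, \dots, a_p, b_0, \dots, b_q$ from observed pairs $(x_t, y_t)$.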