This paper is concerned with fundamental limits on the approximation of nonlinear dynamical systems. Specifically, we show that recurrent neural networks (RNNs) can approximate, in a metric-entropy-optimal manner, nonlinear systems that satisfy a Lipschitz property and forget past inputs fast enough. As the sets of sequence-to-sequence mappings realized by the dynamical systems we consider are significantly more massive than the function classes generally analyzed in approximation theory, a refined metric-entropy characterization is needed, namely in terms of order, type, and generalized dimension. We compute these quantities for the classes of exponentially- and polynomially-Lipschitz fading-memory systems and show that RNNs can achieve them.
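For concreteness, a minimal sketch of the kind of fading-memory Lipschitz condition in question (the notation $\mathsf{T}$, $w$, $C$, $\rho$, $p$ is assumed here for illustration; the precise definitions are those given in the paper): a causal sequence-to-sequence map $\mathsf{T}$ has Lipschitz fading memory if, for all input sequences $x$, $y$ and all times $t$,
\[
  \bigl| (\mathsf{T}x)_t - (\mathsf{T}y)_t \bigr|
    \;\le\; \sum_{s \le t} w(t-s)\, \lvert x_s - y_s \rvert ,
\]
with a nonnegative, summable weight sequence $w$. Under this reading, exponential forgetting corresponds to $w(k) = C\rho^{k}$ with $\rho \in (0,1)$, and polynomial forgetting to $w(k) = C(1+k)^{-p}$ with $p > 1$.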