Reservoir computing approximation and generalization bounds are proved for a new concept class of input/output systems that extends the so-called generalized Barron functionals to a dynamic context. This new class is characterized by readouts with a certain integral representation built on infinite-dimensional state-space systems. It is shown that this class is very rich, possesses useful features, and has universal approximation properties. The reservoir architectures used for the approximation and estimation of elements in the new class are randomly generated echo state networks with either linear or ReLU activation functions. Their readouts are built using randomly generated neural networks in which only the output layer is trained (extreme learning machines or random feature neural networks). The results in the paper yield a fully implementable recurrent neural network-based learning algorithm with provable convergence guarantees that does not suffer from the curse of dimensionality.
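The architecture described in the abstract can be illustrated with a minimal sketch: a randomly generated echo state network with ReLU activation whose state is fed into a random feature readout in which only the output layer is trained (by ridge regression). All dimensions, the scaling of the reservoir matrix, the toy target, and the regularization parameter below are illustrative assumptions, not specifications from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper)
d_in, d_state, d_feat, T = 1, 50, 100, 300

# Randomly generated echo state network with ReLU activation:
#   x_t = ReLU(A x_{t-1} + C u_t + b)
# A is rescaled so its spectral norm is below 1, a sufficient
# contraction condition for the echo state property.
A = rng.normal(size=(d_state, d_state))
A *= 0.9 / np.linalg.norm(A, 2)
C = rng.normal(size=(d_state, d_in))
b = rng.normal(size=d_state)

# Random feature map on the reservoir state: F and c are random
# and frozen; only the output layer W is trained.
F = rng.normal(size=(d_feat, d_state))
c = rng.normal(size=d_feat)

def run_reservoir(u):
    """Drive the reservoir with the input sequence u and collect states."""
    x = np.zeros(d_state)
    states = []
    for u_t in u:
        x = np.maximum(A @ x + C @ u_t + b, 0.0)  # ReLU state update
        states.append(x)
    return np.array(states)

# Hypothetical input/output data: the target mixes a nonlinear
# function of the previous input with the current input.
u = rng.normal(size=(T, d_in))
y = np.sin(u[:-1, 0]) + 0.5 * u[1:, 0]
X = run_reservoir(u)[1:]          # align states with targets

# Random features of the state, then ridge regression on W only.
Phi = np.maximum(X @ F.T + c, 0.0)
lam = 1e-3
W = np.linalg.solve(Phi.T @ Phi + lam * np.eye(d_feat), Phi.T @ y)

y_hat = Phi @ W
print(float(np.mean((y_hat - y) ** 2)))  # training mean-squared error
```

The only trained object is the output-layer vector `W`, which is exactly the "extreme learning machine / random feature" regime the abstract refers to: training reduces to a linear least-squares problem, so the procedure is fully implementable at low cost.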