Many recurrent neural network machine learning paradigms can be formulated using state-space representations. The classical notion of canonical state-space realization is adapted in this paper to accommodate semi-infinite inputs so that it can be used as a dimension reduction tool in the recurrent networks setup. The so-called input forgetting property is identified as the key hypothesis that guarantees the existence and uniqueness (up to system isomorphisms) of canonical realizations for causal and time-invariant input/output systems with semi-infinite inputs. Additionally, the notion of optimal reduction coming from the theory of symmetric Hamiltonian systems is implemented in our setup to construct canonical realizations out of input-forgetting but not necessarily canonical ones. These two procedures are studied in detail in the framework of linear fading memory input/output systems. Finally, the notion of implicit reduction using reproducing kernel Hilbert spaces (RKHS) is introduced, which makes it possible, for systems with linear readouts, to achieve dimension reduction without explicitly computing the reduced spaces introduced in the first part of the paper.
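For concreteness, the state-space systems in question can be written in the following standard discrete-time form; the notation below is an illustrative sketch (the state map $f$, the readout $h$, and the induced filter $H$ are generic placeholders, not the paper's specific constructions):
\[
  x_t = f(x_{t-1}, z_t), \qquad y_t = h(x_t), \qquad t \in \mathbb{Z}_-,
\]
where the inputs are semi-infinite sequences $z = (\ldots, z_{-1}, z_0)$. In this language, the input forgetting property roughly requires that two inputs agreeing on a sufficiently long recent past produce arbitrarily close present outputs, for instance
\[
  \lim_{T \to \infty} \, \sup \big\{ \| H(u)_0 - H(v)_0 \| \, : \, u_t = v_t \ \text{for all } -T \leq t \leq 0 \big\} = 0,
\]
so that the filter asymptotically forgets the remote past of its input.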