The Echo State Network (ESN) is a class of Recurrent Neural Networks with a large number of hidden-to-hidden weights (in the so-called reservoir). The canonical ESN and its variations have recently received significant attention due to their remarkable success in modeling non-linear dynamical systems. The reservoir is randomly connected with fixed weights that do not change during learning; only the weights from the reservoir to the output are trained. Since the reservoir is fixed during the training procedure, we may wonder whether the computational power of the recurrent structure is fully harnessed. In this article, we propose a new ESN-type computational model that represents the reservoir weights in the Fourier space and fine-tunes these weights by applying genetic algorithms in the frequency domain. The main advantage is that this procedure operates in a much smaller space than the classical ESN, thus providing a dimensionality-reduction transformation of the initial method. The proposed technique allows us to exploit the benefits of the large recurrent structure while avoiding the training problems of gradient-based methods. We provide a detailed experimental study that demonstrates the good performance of our approach on well-known chaotic systems and real-world data.
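To make the idea concrete, the following is a minimal, self-contained Python/NumPy sketch of how a reservoir can be parameterized by a handful of frequency-domain coefficients and searched with an evolutionary loop while only the readout is trained. It is not the authors' implementation: all function names, sizes, the simple (1+1) evolutionary step standing in for the genetic algorithm, and the toy prediction task are illustrative assumptions.

```python
# Minimal sketch of the idea described above (not the authors' implementation):
# the reservoir matrix is generated from a short vector of Fourier coefficients,
# so an evolutionary search only explores that low-dimensional frequency-domain
# representation, while the readout is trained by ridge regression as in a
# canonical ESN. All names, sizes, and the toy task are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)


def reservoir_from_fourier(coeffs, n_units, spectral_radius=0.9):
    """Expand frequency-domain coefficients into an n_units x n_units reservoir."""
    n = n_units * n_units
    spectrum = np.zeros(n // 2 + 1, dtype=complex)
    spectrum[:len(coeffs)] = coeffs                  # only low frequencies are free parameters
    w = np.fft.irfft(spectrum, n=n).reshape(n_units, n_units)
    rho = max(abs(np.linalg.eigvals(w)))             # rescale to the desired spectral radius
    return w * (spectral_radius / (rho + 1e-12))


def run_reservoir(w, w_in, u):
    """Drive the fixed reservoir with input sequence u (T x n_in); return the states."""
    x = np.zeros(w.shape[0])
    states = np.empty((len(u), w.shape[0]))
    for t, u_t in enumerate(u):
        x = np.tanh(w @ x + w_in @ u_t)
        states[t] = x
    return states


def train_readout(states, targets, ridge=1e-6):
    """Ridge-regression readout: the only trained weights in a canonical ESN."""
    return np.linalg.solve(states.T @ states + ridge * np.eye(states.shape[1]),
                           states.T @ targets)


# Toy one-step-ahead prediction task (illustrative only).
T = 500
signal = np.sin(0.2 * np.arange(T))[:, None]
u, y = signal[:-1], signal[1:]

n_units, n_coeffs = 100, 16                          # 16 coefficients instead of 100 x 100 weights
w_in = rng.uniform(-0.5, 0.5, (n_units, 1))


def fitness(coeffs):
    w = reservoir_from_fourier(coeffs, n_units)
    states = run_reservoir(w, w_in, u)[100:]          # discard a warm-up transient
    w_out = train_readout(states, y[100:])
    return np.mean((states @ w_out - y[100:]) ** 2)   # training MSE; a real run would hold out data


# Simple (1+1) evolutionary loop over the frequency-domain coefficients,
# standing in for the genetic algorithm mentioned in the abstract.
best = rng.normal(size=n_coeffs) + 1j * rng.normal(size=n_coeffs)
best_err = fitness(best)
for _ in range(30):
    child = best + 0.1 * (rng.normal(size=n_coeffs) + 1j * rng.normal(size=n_coeffs))
    err = fitness(child)
    if err < best_err:
        best, best_err = child, err

print(f"best MSE after search: {best_err:.2e}")
```

In this sketch the evolutionary search only manipulates the 16 complex coefficients rather than the 100 x 100 reservoir weights, which illustrates the dimensionality reduction the abstract refers to.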