Echo State Networks are a type of recurrent neural network with a large, randomly generated reservoir and a small number of readout connections trained via linear regression. The most common reservoir topology is a fully connected network of up to thousands of neurons. Over the years, researchers have introduced a variety of alternative reservoir topologies, such as a circular network or a linear path of connections. When comparing the performance of different topologies or other architectural changes, it is necessary to tune the hyperparameters for each topology separately, since their properties may differ significantly. Hyperparameter tuning is usually carried out manually by selecting the best-performing set of parameters from a sparse grid of predefined combinations. Unfortunately, this approach may lead to underperforming configurations, especially for sensitive topologies. We propose an alternative approach to hyperparameter tuning based on the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Using this approach, we have improved multiple topology comparison results by orders of magnitude, suggesting that topology alone does not play as important a role as properly tuned hyperparameters.
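The pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the reservoir size, spectral-radius and input-scaling ranges, the one-step sine prediction task, and the tuning loop are all assumptions. In particular, a simple (1+1) evolution strategy with multiplicative step-size adaptation stands in for full CMA-ES, which additionally adapts a covariance matrix over the search space.

```python
import numpy as np

def make_esn(n_res, spectral_radius, input_scale, seed=0):
    """Build a random fully connected reservoir (the common topology)."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, (n_res, n_res))
    # Rescale so the largest eigenvalue magnitude equals spectral_radius.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-input_scale, input_scale, (n_res, 1))
    return W_in, W

def reservoir_states(W_in, W, u):
    """Drive the reservoir with a 1-D input sequence and collect states."""
    x = np.zeros(W.shape[0])
    states = []
    for v in u:
        x = np.tanh(W_in[:, 0] * v + W @ x)
        states.append(x.copy())
    return np.array(states)

def nrmse_one_step(params, u, n_res=100, washout=50, ridge=1e-6):
    """Objective: NRMSE of one-step-ahead prediction for given hyperparameters."""
    rho, scale = params
    W_in, W = make_esn(n_res, rho, scale)
    S = reservoir_states(W_in, W, u[:-1])[washout:]
    y = u[1 + washout:]
    # Linear readout trained by ridge regression.
    W_out = np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ y)
    pred = S @ W_out
    return np.sqrt(np.mean((pred - y) ** 2) / np.var(y))

def tune(objective, x0, sigma=0.3, iters=40, seed=1):
    """Simplified (1+1)-ES stand-in for CMA-ES: mutate, keep improvements,
    grow the step size on success and shrink it on failure."""
    rng = np.random.default_rng(seed)
    x, fx = np.array(x0, dtype=float), objective(x0)
    for _ in range(iters):
        cand = np.clip(x + sigma * rng.standard_normal(x.size), 0.05, 1.5)
        fc = objective(cand)
        if fc < fx:
            x, fx, sigma = cand, fc, sigma * 1.3
        else:
            sigma *= 0.85
    return x, fx
```

A usage example, tuning spectral radius and input scaling for next-value prediction of a sine wave:

```python
u = np.sin(0.2 * np.arange(500))
best, err = tune(lambda p: nrmse_one_step(p, u), [0.9, 0.5])
```

In the full method, CMA-ES would replace `tune`, letting the covariance adaptation handle correlated, differently scaled hyperparameters per topology.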