For many years, Evolutionary Algorithms (EAs) have been applied to improve Neural Network (NN) architectures. They have been used to solve different problems, such as training the networks (adjusting the weights), designing the network topology, optimizing global parameters, and selecting features. Here, we provide a brief systematic survey of the applications of EAs in the specific domain of the recurrent NNs named Reservoir Computing (RC). At the beginning of the 2000s, the RC paradigm appeared as a good option for employing recurrent NNs without dealing with the inconveniences of their training algorithms. An RC model uses a nonlinear dynamical system with a fixed recurrent neural network, named the \textit{reservoir}, and the learning process is restricted to adjusting a linear parametric function. However, an RC model has several hyper-parameters; therefore, EAs are helpful tools for finding optimal RC architectures. We provide an overview of the results in the area, discuss novel advances, and present our vision regarding new trends and still open questions.
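As a brief sketch of the paradigm described above (using the standard Echo State Network notation, which is one common RC instantiation rather than a formulation specific to the surveyed works), the reservoir state $\mathbf{x}(t)$ evolves under fixed random weights, and only the linear readout $\mathbf{W}^{\mathrm{out}}$ is trained:
\begin{align}
\mathbf{x}(t+1) &= \tanh\!\left(\mathbf{W}\,\mathbf{x}(t) + \mathbf{W}^{\mathrm{in}}\,\mathbf{u}(t+1)\right),\\
\hat{\mathbf{y}}(t) &= \mathbf{W}^{\mathrm{out}}\,\mathbf{x}(t),
\end{align}
where $\mathbf{u}(t)$ is the input, $\mathbf{W}$ and $\mathbf{W}^{\mathrm{in}}$ are fixed (the \textit{reservoir}), and $\mathbf{W}^{\mathrm{out}}$ is typically obtained by linear (e.g., ridge) regression. Hyper-parameters such as the reservoir size, the spectral radius of $\mathbf{W}$, and the input scaling are exactly the quantities that EAs are used to optimize.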