Reservoir Computing (RC) is an appealing approach in Machine Learning that combines the high computational capabilities of Recurrent Neural Networks with a fast and simple training method. Moreover, the successful implementation of neuro-inspired plasticity rules in RC artificial networks has boosted the performance of the original models. In this manuscript, we analyze the role that plasticity rules play in the changes that lead to better RC performance. To this end, we implement synaptic and non-synaptic plasticity rules in a paradigmatic example of an RC model: the Echo State Network. Testing on nonlinear time series prediction tasks, we show evidence that the improved performance in all plastic models is linked to a decrease of the pair-wise correlations in the reservoir, as well as a significant increase in individual neurons' ability to separate similar inputs in their activity space. Here we provide new insights into this observed improvement through the study of different stages of plastic learning. From the perspective of the reservoir dynamics, optimal performance is found to occur close to the so-called edge of instability. Our results also show that it is possible to combine different forms of plasticity (namely synaptic and non-synaptic rules) to further improve performance on prediction tasks, obtaining better results than those achieved with single-plasticity models.
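To make the paradigm concrete, the following is a minimal sketch of an Echo State Network with a ridge-regression readout in NumPy. The reservoir size, spectral radius (kept below 1, i.e. below the edge of instability mentioned above), regularization strength, and the toy sine-based prediction task are illustrative assumptions, not the actual configuration or benchmark used in the manuscript, and no plasticity rule is applied here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumptions, not the paper's setup).
N = 200                # reservoir size
spectral_radius = 0.9  # scaled below 1, i.e. inside the stable regime

# Random recurrent weights, rescaled to the chosen spectral radius.
W = rng.uniform(-1, 1, (N, N))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1, 1, N)  # input weights for a scalar input

# Stand-in nonlinear time series: one-step-ahead prediction target.
u = np.sin(np.arange(2000) * 0.2) ** 3

# Drive the reservoir; X[t] is the state after seeing u[0..t-1].
X = np.zeros((len(u), N))
x = np.zeros(N)
for t in range(len(u) - 1):
    x = np.tanh(W @ x + W_in * u[t])
    X[t + 1] = x

# Only the linear readout is trained (ridge regression), which is what
# makes RC training fast and simple. Discard an initial washout period.
washout = 100
A, b = X[washout:], u[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ b)

pred = A @ W_out
nrmse = np.sqrt(np.mean((pred - b) ** 2)) / np.std(b)
print(f"train NRMSE: {nrmse:.3f}")
```

Only `W_out` is fitted; the recurrent weights `W` stay fixed, which is precisely where the plasticity rules studied in the manuscript intervene instead.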