Spiking Neural Networks (SNNs) have emerged as a promising direction within the field of Artificial Neural Networks (ANNs), attracting research attention for their ability to mimic the human brain and process complex information with remarkable speed and accuracy. This research aimed to optimise the training of Liquid State Machines (LSMs), a recurrent SNN architecture, by identifying the most effective range of synaptic weights to assign in the SNN so as to minimise the difference between the desired and actual output. The experimental results showed that, by combining spike metrics with a sweep over candidate weight ranges, the gap between the desired and actual output of the spiking neurons could be effectively reduced, leading to improved SNN performance. The findings were tested and confirmed using three different weight initialisation approaches, with the best results obtained using the Barabasi-Albert random graph method.
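To make the initialisation step concrete, the sketch below shows one way a reservoir weight matrix could be built from a Barabasi-Albert random graph and a candidate weight range, using networkx and numpy. This is a minimal illustration only: the function name, neuron count, attachment parameter, and weight ranges are assumptions for demonstration and are not the values or implementation used in the study.

```python
import networkx as nx
import numpy as np

def ba_reservoir_weights(n_neurons=100, m_edges=3, w_min=0.1, w_max=1.0, seed=42):
    """Build a reservoir weight matrix whose connectivity follows a
    Barabasi-Albert random graph, with weights drawn uniformly from a
    candidate range [w_min, w_max]. Illustrative sketch; parameters
    are assumed, not taken from the paper."""
    rng = np.random.default_rng(seed)
    graph = nx.barabasi_albert_graph(n_neurons, m_edges, seed=seed)
    weights = np.zeros((n_neurons, n_neurons))
    for i, j in graph.edges():
        # BA graphs are undirected; assign one sampled weight from the
        # tested range to both directions of each edge.
        w = rng.uniform(w_min, w_max)
        weights[i, j] = w
        weights[j, i] = w
    return weights

# Example sweep over candidate weight ranges. In the actual training
# procedure, each matrix would drive the LSM and the resulting spike
# trains would be compared to the desired output with a spike metric.
candidate_ranges = [(0.05, 0.5), (0.1, 1.0), (0.5, 2.0)]
reservoirs = {r: ba_reservoir_weights(w_min=r[0], w_max=r[1]) for r in candidate_ranges}
```

The choice of a Barabasi-Albert graph gives the reservoir a scale-free connectivity pattern, which is what distinguishes this initialisation approach from the other two tested in the study.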