During early life, the brain develops while it learns through a process called synaptogenesis: as neurons grow and interact with each other, they create synapses, many of which the brain eventually prunes. While previous work has treated learning and pruning independently, in this work we propose a biologically plausible model that combines Hebbian learning and pruning to simulate the synaptogenesis process. In this way, while learning how to solve a task, the agent translates its experience into a particular network structure; that is, the network structure builds itself during task execution. We call this approach the Self-building Neural Network (SBNN). We compare the proposed SBNN with traditional neural networks (NNs) on three classic control tasks from OpenAI. The results show that our model generally outperforms traditional NNs. Moreover, we observe that the performance decay as the pruning rate increases is smaller for our model than for NNs. Finally, we perform a validation test, evaluating the models on tasks unseen during the learning phase. Here the results show that SBNNs adapt to new tasks better than traditional NNs, especially when more than $80\%$ of the weights are pruned.
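To illustrate the mechanism sketched above, the following is a minimal Python sketch of one plausible combination of Hebbian learning and pruning. It assumes a simple outer-product Hebbian rule and magnitude-based pruning; the function names and the learning rate $\eta$ are hypothetical, and the paper's actual update and pruning rules may differ.

```python
import numpy as np

def hebbian_update(W, pre, post, eta=0.01):
    """One Hebbian step: strengthen weights between co-active neurons.

    Assumes a plain outer-product rule (hypothetical; the paper's rule
    may use a richer parameterization).
    """
    return W + eta * np.outer(post, pre)

def prune(W, rate):
    """Magnitude pruning: zero out the `rate` fraction of smallest-magnitude weights."""
    k = int(rate * W.size)
    if k == 0:
        return W
    # k-th smallest absolute weight serves as the pruning threshold.
    thresh = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
    return W * (np.abs(W) > thresh)

# Toy usage: weights adapt to pre/post activity, then get sparsified.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 3))   # 3 inputs -> 4 outputs
pre, post = rng.random(3), rng.random(4)
W = hebbian_update(W, pre, post)
W = prune(W, rate=0.8)                    # e.g., 80% of weights removed
```

Under this reading, the network "builds itself": activity-driven Hebbian updates shape the weights during the task, and pruning then carves out the resulting structure.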