In this paper, the use of third-generation machine learning, also known as spiking neural network (SNN) architecture, for continuous learning was investigated and compared to conventional models. The experimentation was divided into three phases. The first phase focused on training the conventional models via transfer learning. The second phase trained a spiking model built with the Nengo library. Lastly, each conventional model was converted into a spiking neural network and trained. Results from phase 1 are in line with the existing continuous-learning literature: all models were able to correctly identify the currently trained classes, but performance on previously learned classes dropped sharply due to catastrophic forgetting. The SNN models, however, were able to retain some information about previous classes. Although many samples from previous classes were still misclassified as the currently trained classes, the output probabilities assigned to the true class were higher than expected. This indicates that SNN models have the potential to mitigate catastrophic forgetting, although much work is still needed.
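For context, the conversion step in the third phase can be illustrated with the NengoDL converter, which maps a trained Keras network onto a spiking equivalent. The following is a minimal sketch under assumed settings (the layer sizes, input shape, and the swap of ReLU for spiking rectified-linear units are illustrative and not the exact configuration used in the experiments).

```python
import nengo
import nengo_dl
import tensorflow as tf

# Hypothetical conventional model standing in for a transfer-learned network
inp = tf.keras.Input(shape=(28, 28, 1))
x = tf.keras.layers.Conv2D(32, 3, activation="relu")(inp)
x = tf.keras.layers.Flatten()(x)
out = tf.keras.layers.Dense(10)(x)
model = tf.keras.Model(inputs=inp, outputs=out)

# Convert the rate-based network into a spiking one:
# ReLU activations are swapped for spiking rectified-linear neurons
converter = nengo_dl.Converter(
    model,
    swap_activations={tf.nn.relu: nengo.SpikingRectifiedLinear()},
)

# The converted network can then be trained or evaluated in the NengoDL simulator
with nengo_dl.Simulator(converter.net, minibatch_size=1) as sim:
    pass  # sim.fit(...) / sim.predict(...) would be called here
```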