Energy- and data-efficient online time series prediction of evolving dynamical systems is critical in several fields, especially edge AI applications that must update continuously from streaming data. However, current DNN-based supervised online learning models require large amounts of training data and cannot adapt quickly when the underlying system changes. Moreover, these models require continuous retraining on incoming data, making them highly inefficient. To address these issues, we present a novel Continuous Learning-based Unsupervised Recurrent Spiking Neural Network Model (CLURSNN), trained with spike-timing-dependent plasticity (STDP). CLURSNN makes online predictions by reconstructing the underlying dynamical system via random delay embedding, using the membrane potentials of the neurons with the highest betweenness centrality in the recurrent layer of the RSNN. We also use topological data analysis to propose a novel methodology that uses the Wasserstein distance between the persistence homologies of the predicted and observed time series as a loss function. We show that the proposed online time series prediction methodology outperforms state-of-the-art DNN models when predicting an evolving Lorenz63 dynamical system.
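To make the topological loss concrete, the following is a minimal sketch (not the paper's implementation) of comparing a predicted and an observed time series by the Wasserstein distance between their persistence diagrams: each series is lifted to a point cloud with a delay embedding, persistent homology is computed, and the H1 diagrams are compared. It assumes the third-party ripser and persim packages; the helper names (delay_embed, topological_loss) and the hyperparameters (embedding dimension, delay) are illustrative choices only.

    # Sketch of a topological loss: delay-embed two series, compute persistence
    # diagrams, and take the Wasserstein distance between them.
    # Assumes `pip install ripser persim`; parameters are illustrative.
    import numpy as np
    from ripser import ripser          # persistent homology of point clouds
    from persim import wasserstein     # Wasserstein distance between diagrams

    def delay_embed(x, dim=3, tau=5):
        """Delay-coordinate embedding of a 1-D series x into R^dim."""
        n = len(x) - (dim - 1) * tau
        return np.stack([x[i * tau: i * tau + n] for i in range(dim)], axis=1)

    def topological_loss(predicted, observed, dim=3, tau=5):
        """Wasserstein distance between the H1 persistence diagrams of two series."""
        dgm_pred = ripser(delay_embed(np.asarray(predicted), dim, tau), maxdim=1)["dgms"][1]
        dgm_obs = ripser(delay_embed(np.asarray(observed), dim, tau), maxdim=1)["dgms"][1]
        return wasserstein(dgm_pred, dgm_obs)

    if __name__ == "__main__":
        t = np.linspace(0, 20 * np.pi, 2000)
        observed = np.sin(t)                                           # toy "observed" series
        predicted = np.sin(t + 0.1) + 0.05 * np.random.randn(len(t))   # toy prediction
        print("topological loss:", topological_loss(predicted, observed))

A loss of this form compares the reconstructed attractors' shapes rather than pointwise errors, which is why it is well suited to detecting when the underlying dynamical system has changed.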