In this paper, we present a recurrent neural system named Long Short-term Cognitive Networks (LSTCNs) as a generalization of the Short-term Cognitive Network (STCN) model. Such a generalization is motivated by the difficulty of forecasting very long time series efficiently. The LSTCN model can be defined as a collection of STCN blocks, each processing a specific time patch of the (multivariate) time series being modeled. In this neural ensemble, each block passes information to the subsequent one in the form of weight matrices representing the prior knowledge. As a second contribution, we propose a deterministic learning algorithm to compute the learnable weights while preserving the prior knowledge resulting from previous learning processes. As a third contribution, we introduce a feature influence score as a proxy to explain the forecasting process in multivariate time series. Simulations on three case studies show that our neural system attains small forecasting errors while being significantly faster than state-of-the-art recurrent models.
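To make the block-chaining idea concrete, the following is a minimal NumPy sketch, not the authors' implementation: it assumes a sigmoid transfer function and a ridge-regression (closed-form) solution for the learnable weights, and the helper names `fit_block`, `lstcn_fit`, and `lstcn_predict` are hypothetical. It only illustrates how each block's learned weights can be passed forward as the next block's prior knowledge.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_block(X, Y, W1, b1, lam=1e-2):
    """One STCN-like block: the prior weights (W1, b1) stay frozen, and the
    learnable output weights (W2, b2) are computed deterministically via
    ridge regression (assumption), with no iterative optimization."""
    H = sigmoid(X @ W1 + b1)                        # hidden state from prior knowledge
    Phi = np.hstack([H, np.ones((H.shape[0], 1))])  # append a bias column
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])    # regularized normal equations
    W = np.linalg.solve(A, Phi.T @ Y)
    return W[:-1, :], W[-1:, :]                     # split into W2 and b2

def lstcn_fit(patches, n_features, lam=1e-2, seed=0):
    """Chain of blocks: each (X, Y) pair is one time patch, and the weights
    learned on a patch become the prior knowledge of the next block."""
    rng = np.random.default_rng(seed)
    W1 = 0.1 * rng.standard_normal((n_features, n_features))  # initial prior (random here)
    b1 = np.zeros((1, n_features))
    for X, Y in patches:
        W2, b2 = fit_block(X, Y, W1, b1, lam)   # learn on the current patch
        last = (W1, b1, W2, b2)                 # keep the last block's weights
        W1, b1 = W2, b2                         # pass knowledge to the next block
    return last

def lstcn_predict(block, X):
    """Forecast with the last block: prior layer followed by the learned layer."""
    W1, b1, W2, b2 = block
    H = sigmoid(X @ W1 + b1)
    return sigmoid(H @ W2 + b2)

# Example usage on synthetic data split into two time patches (hypothetical):
# rng = np.random.default_rng(1)
# series = rng.random((200, 4))
# patches = [(series[:99], series[1:100]), (series[100:199], series[101:200])]
# block = lstcn_fit(patches, n_features=4)
# forecast = lstcn_predict(block, series[-1:])
```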