Recurrent Neural Networks (RNNs) have been widely applied to temporal problems such as flood forecasting and financial data processing. On the one hand, traditional RNN models suffer from gradient problems caused by their strict serial dependency across time steps, making it difficult to realize long-term memory. On the other hand, common RNN cells are structurally complex, which significantly increases computational complexity and wastes computational resources during model training. In this paper, an improved Time Feedforward Connections Recurrent Neural Networks (TFC-RNNs) model is first proposed to address the gradient issue. A parallel branch is introduced that transfers the hidden state at time t-2 directly to time t, bypassing the nonlinear transformation at time t-1; this effectively improves the long-term dependency of RNNs. Then, a novel cell structure named Single Gate Recurrent Unit (SGRU) is presented. This cell structure reduces the number of parameters in the RNN cell and consequently the computational complexity. Applying the SGRU to the TFC-RNNs yields a new TFC-SGRU model that resolves both difficulties. Finally, the performance of the proposed TFC-SGRU was verified through several experiments on long-term memory and anti-interference capability. Experimental results demonstrate that the proposed TFC-SGRU model can capture useful information across 1500 time steps and effectively filter out noise, and that its accuracy exceeds that of the LSTM and GRU models on language-processing tasks.
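The two mechanisms described above — a single-gate cell and a direct branch carrying the hidden state from time t-2 to time t — can be sketched as follows. This is a minimal illustrative sketch: the parameter names and the exact update equations are assumptions for exposition, not the paper's formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tfc_sgru_step(x_t, h_prev, h_prev2, p):
    # SGRU idea: a single gate z controls both forgetting and updating,
    # instead of the separate gates used by LSTM/GRU cells.
    z = sigmoid(p["Wz"] @ x_t + p["Uz"] @ h_prev)
    h_cand = np.tanh(p["Wh"] @ x_t + p["Uh"] @ h_prev)
    # Time-feedforward branch: h_{t-2} enters the update directly,
    # skipping the nonlinear transformation applied at step t-1.
    return (1.0 - z) * h_prev + z * h_cand + p["Us"] @ h_prev2

def run_tfc_sgru(xs, hidden_size, seed=0):
    """Run the sketch cell over a sequence xs of shape (T, input_size)."""
    rng = np.random.default_rng(seed)
    input_size = xs.shape[1]
    # Small random weights; in a real model these would be learned.
    p = {
        "Wz": rng.normal(scale=0.1, size=(hidden_size, input_size)),
        "Uz": rng.normal(scale=0.1, size=(hidden_size, hidden_size)),
        "Wh": rng.normal(scale=0.1, size=(hidden_size, input_size)),
        "Uh": rng.normal(scale=0.1, size=(hidden_size, hidden_size)),
        "Us": rng.normal(scale=0.1, size=(hidden_size, hidden_size)),
    }
    h_prev = np.zeros(hidden_size)
    h_prev2 = np.zeros(hidden_size)
    for x_t in xs:
        h_prev2, h_prev = h_prev, tfc_sgru_step(x_t, h_prev, h_prev2, p)
    return h_prev
```

Because the t-2 branch is linear, gradients flowing through it avoid one nonlinearity per two steps, which is the intuition behind the improved long-term dependency.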