We investigate online nonlinear regression with continually running recurrent neural networks (RNNs), i.e., RNN-based online learning. For RNN-based online learning, we introduce an efficient first-order training algorithm that is theoretically guaranteed to converge to the optimal network parameters. Our algorithm is truly online in that it makes no assumptions about the learning environment to guarantee convergence. Through numerical simulations, we verify our theoretical results and demonstrate significant performance improvements achieved by our algorithm over state-of-the-art RNN training methods.
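To make the setting concrete, the sketch below shows a minimal continually running RNN trained online for regression. It uses a generic per-step SGD update with a one-step (instantaneous) gradient; this is an illustrative assumption, not the specific first-order algorithm the abstract refers to, and all names (`OnlineRNN`, `step`) are hypothetical.

```python
import numpy as np

class OnlineRNN:
    """Minimal continually running RNN for online nonlinear regression.

    Illustrative sketch only: per-step SGD with gradients truncated at
    the previous hidden state, not the paper's specific algorithm.
    """

    def __init__(self, n_in, n_hidden, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.Wh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
        self.wo = rng.normal(scale=0.1, size=n_hidden)
        self.h = np.zeros(n_hidden)   # state persists: the RNN runs continually
        self.lr = lr

    def step(self, x, y):
        """Predict y_hat from input x, then update the parameters online."""
        h_prev = self.h
        h = np.tanh(self.Wx @ x + self.Wh @ h_prev)
        y_hat = self.wo @ h
        err = y_hat - y                      # gradient of 0.5 * err**2 w.r.t. y_hat
        g = err * self.wo * (1.0 - h ** 2)   # backprop through tanh, one step only
        self.wo -= self.lr * err * h
        self.Wx -= self.lr * np.outer(g, x)
        self.Wh -= self.lr * np.outer(g, h_prev)
        self.h = h                           # carry the state to the next sample
        return y_hat, 0.5 * err ** 2
```

Each call to `step` consumes one (input, target) pair and updates the parameters immediately, so the learner makes no assumptions about the data-generating process; it simply tracks the stream.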