Recurrent neural networks (RNNs) have shown promising potential for learning the dynamics of sequential data. However, artificial neural networks are known to exhibit poor robustness in the presence of input noise, and the sequential architecture of RNNs exacerbates the problem. In this paper, we use ideas from control and estimation theory to propose a tractable robustness analysis for RNN models subject to input noise. The variance of the output of the noisy system is adopted as a robustness measure that quantifies the impact of the noise on learning. We show that this robustness measure can be estimated efficiently using linearization techniques. Building on these results, we propose a learning method that enhances the robustness of an RNN with respect to exogenous Gaussian noise with known statistics. Extensive simulations on benchmark problems reveal that the proposed methodology significantly improves the robustness of recurrent neural networks.
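To make the idea concrete, the sketch below (not the authors' implementation) estimates the output variance of a vanilla tanh RNN driven by Gaussian input noise by linearizing the dynamics along the nominal, noise-free trajectory and propagating the perturbation covariance, then compares the result with a Monte Carlo estimate. The model sizes, random weights, linear read-out, and noise level `sigma` are all illustrative assumptions, not quantities specified in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_h, n_x, T, sigma = 8, 3, 30, 0.1        # hidden size, input size, horizon, noise std (assumed)
W_h = rng.normal(scale=0.3, size=(n_h, n_h))
W_x = rng.normal(scale=0.3, size=(n_h, n_x))
c = rng.normal(size=n_h)                  # linear read-out y_t = c^T h_t (assumed)
x_nom = rng.normal(size=(T, n_x))         # nominal (noise-free) input sequence

def rollout(x_seq):
    """Vanilla RNN rollout h_{t+1} = tanh(W_h h_t + W_x x_t); returns the hidden trajectory."""
    h = np.zeros(n_h)
    hs = []
    for x in x_seq:
        h = np.tanh(W_h @ h + W_x @ x)
        hs.append(h)
    return np.array(hs)

# Linearized covariance propagation along the nominal trajectory:
# delta_h_{t+1} ~= A_t delta_h_t + B_t w_t, with w_t ~ N(0, sigma^2 I).
h = np.zeros(n_h)
P = np.zeros((n_h, n_h))                  # covariance of the hidden-state perturbation
for x in x_nom:
    pre = W_h @ h + W_x @ x
    D = np.diag(1.0 - np.tanh(pre) ** 2)  # tanh derivative at the nominal pre-activation
    A, B = D @ W_h, D @ W_x               # Jacobians w.r.t. hidden state and input
    P = A @ P @ A.T + sigma ** 2 * (B @ B.T)
    h = np.tanh(pre)
var_lin = c @ P @ c                       # linearized variance of the final output y_T

# Monte Carlo reference: perturb the inputs with Gaussian noise and measure the spread of y_T.
y_nom = c @ rollout(x_nom)[-1]
samples = [c @ rollout(x_nom + sigma * rng.normal(size=x_nom.shape))[-1] for _ in range(2000)]
var_mc = np.var(np.array(samples) - y_nom)

print(f"linearized output variance: {var_lin:.5f}, Monte Carlo: {var_mc:.5f}")
```

For small noise levels the two estimates should agree closely, which is the sense in which a linearization-based measure can stand in for the (expensive) sampled output variance; a robustness-aware training loss could then penalize the linearized variance directly.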