The goal of this paper is to provide sufficient conditions guaranteeing the Input-to-State Stability (ISS) and the Incremental Input-to-State Stability (δISS) of Gated Recurrent Unit (GRU) neural networks. These conditions, devised for both single-layer and multi-layer architectures, consist of nonlinear inequalities on the network's weights. They can be employed to check the stability of a trained network, or can be enforced as constraints during the training procedure of a GRU. The resulting training procedure is tested on the Quadruple Tank nonlinear benchmark system, showing satisfactory modeling performance.
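The two uses of the conditions described above, post-hoc certification of a trained network and enforcement as a training constraint, can be sketched in code. The sketch below is purely illustrative: the paper's actual δISS conditions are specific nonlinear inequalities on the GRU weights, whereas here we assume, as a hypothetical stand-in, a scalar measure `nu` built from spectral norms of the recurrent weight matrices, with `nu < 1` playing the role of the sufficient condition. The function names and the form of `nu` are our own assumptions, not the paper's.

```python
import numpy as np

def spectral_norm(W):
    """Largest singular value of W."""
    return np.linalg.svd(W, compute_uv=False)[0]

def stability_margin(W_z, W_r, W_h):
    """Hypothetical scalar measure nu of the recurrent weight matrices
    (update gate W_z, reset gate W_r, candidate state W_h).
    nu < 1 is taken here, for illustration only, as the sufficient
    stability condition; the paper's real conditions differ."""
    return 0.25 * spectral_norm(W_z) + 0.25 * spectral_norm(W_r) \
        + spectral_norm(W_h)

def check_stability(W_z, W_r, W_h):
    """Post-hoc check on a trained network's weights."""
    return stability_margin(W_z, W_r, W_h) < 1.0

def stability_penalty(W_z, W_r, W_h, weight=10.0):
    """Soft-constraint version for use inside a training loss:
    zero when the (assumed) inequality holds, positive otherwise."""
    return weight * max(0.0, stability_margin(W_z, W_r, W_h) - 1.0)

# Deterministic example: small recurrent weights satisfy the
# hypothetical condition, large ones violate it.
n = 3
print(check_stability(0.1 * np.eye(n), 0.1 * np.eye(n), 0.1 * np.eye(n)))
print(check_stability(2.0 * np.eye(n), 2.0 * np.eye(n), 2.0 * np.eye(n)))
```

In a constrained training procedure, `stability_penalty` would be added to the fitting loss so that the optimizer is steered toward weight regions where the inequality holds, while `check_stability` serves as the a-posteriori certificate.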