We present a new model of neural networks called Min-Max-Plus Neural Networks (MMP-NNs), based on operations in tropical arithmetic. In general, an MMP-NN is composed of three types of alternately stacked layers: linear layers, min-plus layers, and max-plus layers. The latter two types of layers constitute the nonlinear part of the network, which is trainable and more expressive than the nonlinear part of conventional neural networks. With this greater capacity for expressing nonlinearity, we show that MMP-NNs are universal approximators of continuous functions, even when the number of multiplication operations is drastically reduced (possibly to none in certain extreme cases). Furthermore, we formulate the backpropagation algorithm for training MMP-NNs and introduce a normalization algorithm that improves the rate of convergence during training.
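As a minimal sketch of the tropical operations involved, the following assumes a min-plus layer computes y_i = min_j (W_ij + x_j) and a max-plus layer computes y_i = max_j (W_ij + x_j), i.e., tropical matrix-vector products in which addition plays the role of multiplication; the function names, layer ordering, and shapes are illustrative, not the paper's exact architecture.

```python
import numpy as np

def min_plus(W, x):
    # Tropical (min, +) product: y_i = min_j (W_ij + x_j).
    # Only additions and comparisons are used -- no multiplications.
    return np.min(W + x, axis=1)

def max_plus(W, x):
    # Tropical (max, +) product: y_i = max_j (W_ij + x_j).
    return np.max(W + x, axis=1)

def mmp_block(A, W_min, W_max, x):
    # One illustrative stack: linear layer, then min-plus, then max-plus.
    h = A @ x                # linear layer (ordinary arithmetic)
    h = min_plus(W_min, h)   # trainable min-plus layer
    return max_plus(W_max, h)  # trainable max-plus layer
```

Note that both tropical layers are piecewise-linear in x, which is the source of the trainable nonlinearity described above.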