An activation function has a significant impact on the efficiency and robustness of neural networks. As an alternative, we developed a novel non-monotonic activation function, the Negative Stimulated Hybrid Activation Function (Nish). It behaves like a Rectified Linear Unit (ReLU) in the positive region and like a sinus-sigmoidal function in the negative region. In other words, it incorporates a sigmoid and a sine function, gaining new dynamics over the classical ReLU. We analyzed the consistency of Nish across different combinations of essential networks and the most common activation functions on several popular benchmarks. The experimental results show that the accuracy rates achieved by Nish are slightly better than those of Mish on classification tasks.
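To make the described piecewise shape concrete, the following minimal Python sketch implements an activation of this form. The exact negative branch used here, sigmoid(x) * sin(x), is an assumption inferred from the "sinus-sigmoidal" description in the abstract, not the paper's verbatim definition.

```python
import numpy as np

def nish(x):
    """Sketch of the Nish activation described above: identity (ReLU-like)
    for x > 0 and a sinus-sigmoidal branch for x <= 0.

    NOTE: the negative branch sigmoid(x) * sin(x) is an assumption based on
    the abstract's description; consult the paper for the exact formula."""
    x = np.asarray(x, dtype=float)
    sigmoid = 1.0 / (1.0 + np.exp(-x))
    return np.where(x > 0, x, sigmoid * np.sin(x))

# Quick sanity check: the two branches meet at 0 (sigmoid(0) * sin(0) = 0),
# so the function is continuous there.
print(nish([-2.0, -0.5, 0.0, 0.5, 2.0]))
```

Because sin(x) oscillates below zero while sigmoid(x) damps it toward zero for large negative inputs, this form is non-monotonic in the negative region, consistent with the non-monotonicity claimed above.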