Activation functions play a crucial role in the performance and stability of neural networks. In this study, we propose a novel non-monotonic activation function called the Negative Stimulated Hybrid Activation Function (Nish). It behaves like the Rectified Linear Unit (ReLU) for values greater than zero and like a sinus-sigmoidal function for values less than zero. The proposed function combines the sigmoid and the sine wave, introducing dynamics beyond those of traditional ReLU activations. We evaluate the robustness of Nish across combinations of well-established architectures and recently proposed activation functions on several well-known benchmarks. The results indicate that the accuracy rates obtained with the proposed activation function are slightly higher than those obtained with weights trained using the Mish activation.
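The piecewise behavior described above can be sketched as follows. The abstract only specifies that Nish is ReLU-like for positive inputs and sinus-sigmoidal for negative inputs; the exact negative-branch formula `sin(x) * sigmoid(x)` used here is an illustrative assumption chosen to combine the sine and sigmoid components and to remain continuous at zero, not the authors' published definition.

```python
import math

def sigmoid(x: float) -> float:
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

def nish(x: float) -> float:
    """Sketch of a Nish-style activation.

    Identity (ReLU-like) for x >= 0; a hypothetical
    sinus-sigmoidal branch for x < 0. The negative branch
    is an assumption for illustration only.
    """
    if x >= 0.0:
        return x  # ReLU-like: pass positives through unchanged
    # Hypothetical negative branch combining sine and sigmoid;
    # continuous at 0 since sin(0) * sigmoid(0) = 0.
    return math.sin(x) * sigmoid(x)
```

With this form, positive inputs are unchanged, while negative inputs produce small, bounded, non-monotonic negative responses, which is the qualitative behavior the abstract attributes to Nish.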