This paper proposes a novel nonlinear activation mechanism for convolutional neural networks (CNNs), named the reborn mechanism. In sharp contrast to ReLU, which cuts off negative-phase values, the reborn mechanism has the capacity to revive and reconstruct dead neurons. Compared with other improved ReLU functions, the reborn mechanism introduces a more appropriate way to utilize negative-phase information. Extensive experiments validate that this activation mechanism enhances model representation ability more significantly and makes better use of the input information, while retaining the advantages of the original ReLU function. Moreover, the reborn mechanism enables a non-symmetry that traditional CNNs can hardly achieve, and it can act as a channel compensation method, offering competitive or even better performance with fewer learned parameters than traditional methods. The reborn mechanism was tested on various benchmark datasets, obtaining better performance than previous nonlinear activation functions on all of them.
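The abstract does not give the reborn mechanism's formula, but the contrast it draws can be illustrated with a minimal sketch: ReLU zeroes the negative phase, whereas a negative-phase-reusing activation can route that information into extra channels (a CReLU-style channel compensation, used here purely as a hypothetical stand-in, not the paper's actual reborn formula):

```python
import numpy as np

def relu(x):
    # standard ReLU: negative-phase values are discarded ("dead")
    return np.maximum(x, 0.0)

def negative_phase_reuse(x):
    # Hypothetical sketch (NOT the paper's reborn formula): keep the
    # positive phase and preserve the negative phase in additional
    # channels, so no input information is cut off. This doubles the
    # channel count, mimicking a channel-compensation effect.
    return np.concatenate([np.maximum(x, 0.0), np.maximum(-x, 0.0)], axis=-1)

x = np.array([[-2.0, 1.0, -0.5]])
print(relu(x))                  # negative entries become 0
print(negative_phase_reuse(x))  # negative phase survives in extra channels
```

Under this sketch, a layer followed by such an activation yields twice the output channels without extra learned parameters, which is one plausible reading of how an activation can serve as channel compensation.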