Convolutional neural networks learn spatial features that are heavily interlinked within kernels. The squeeze-and-excitation (SE) module breaks with the traditional design in which each layer passes its entire output to the next; instead, it emphasizes only the most informative features through its squeeze and excitation operations. We propose variations of the SE module that improve the squeeze-and-excitation process and enhance performance. The proposed squeezing or exciting of the layer enables a smooth transition of layer weights. These variations also retain the characteristics of the original SE module. Experiments are carried out on residual networks and the results are tabulated.
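To make the squeeze-and-excitation mechanism concrete, the following is a minimal NumPy sketch of a standard SE block (global average pooling as the squeeze, a bottleneck MLP with ReLU and sigmoid gating as the excitation, then channel-wise rescaling). It illustrates the baseline module the paper builds on, not the proposed variations; the weights `w1` and `w2` and the reduction ratio `r` are hypothetical, randomly initialised for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(x, w1, w2):
    """Apply a squeeze-and-excitation block to a feature map x of shape (C, H, W).

    w1: (C//r, C) channel-reduction weights; w2: (C, C//r) channel-expansion
    weights (both hypothetical placeholders for learned parameters).
    """
    # Squeeze: global average pooling -> one descriptor per channel
    z = x.mean(axis=(1, 2))                      # shape (C,)
    # Excitation: bottleneck MLP, ReLU then sigmoid, yields per-channel gates in (0, 1)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))    # shape (C,)
    # Scale: reweight each channel of the input feature map by its gate
    return x * s[:, None, None]

rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 4, 2
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C))
w2 = rng.standard_normal((C, C // r))
y = se_block(x, w1, w2)
print(y.shape)  # (8, 4, 4)
```

Because the sigmoid gate lies strictly in (0, 1), each output channel is a uniformly scaled copy of the corresponding input channel: the block passes features through with learned per-channel importance rather than forwarding the raw result unchanged.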