In this work, we study binary neural networks (BNNs), in which both the weights and activations are binary (i.e., 1-bit representation). Feature representation is critical for deep neural networks, yet in BNNs the features differ only in their signs. Prior work introduces scaling factors into the binary weights and activations to reduce the quantization error and effectively improves the classification accuracy of BNNs. However, the scaling factors not only increase the computational complexity of the network but also contribute nothing to the signs of the binary features. To this end, we propose the Self-Distribution Binary Neural Network (SD-BNN). First, we utilize Activation Self Distribution (ASD) to adaptively adjust the sign distribution of the activations, thereby improving the sign differences of the convolution outputs. Second, we adjust the sign distribution of the weights through Weight Self Distribution (WSD) and then fine-tune the sign distribution of the convolution outputs. Extensive experiments on the CIFAR-10 and ImageNet datasets with various network structures show that the proposed SD-BNN consistently outperforms state-of-the-art (SOTA) BNNs (e.g., achieving 92.5% on CIFAR-10 and 66.5% on ImageNet with ResNet-18) at a lower computational cost. Code is available at https://github.com/pingxue-hfut/SD-BNN.
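To make the ASD/WSD idea concrete, below is a minimal PyTorch sketch, assuming (as an illustration, not the paper's exact formulation) that the self-distribution operations are realized as learnable shifts added to activations and weights before the sign function; the class and parameter names (`SDBinaryConv2d`, `asd`, `wsd`) are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SignSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator for the backward pass."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Clipped STE: pass gradients only where |x| <= 1.
        return grad_output * (x.abs() <= 1).float()


class SDBinaryConv2d(nn.Module):
    """Binary convolution with learnable shifts that adjust the sign
    distribution of activations (ASD) and weights (WSD). Hypothetical sketch."""

    def __init__(self, in_ch, out_ch, k=3, stride=1, padding=1):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.01)
        # Assumed parameterization: per-channel activation shift (ASD)
        # and per-filter weight shift (WSD), learned end to end.
        self.asd = nn.Parameter(torch.zeros(1, in_ch, 1, 1))
        self.wsd = nn.Parameter(torch.zeros(out_ch, 1, 1, 1))
        self.stride, self.padding = stride, padding

    def forward(self, x):
        # Shift activations before binarization so their sign distribution
        # is adjusted adaptively, then take the sign.
        xb = SignSTE.apply(x + self.asd)
        # Shift weights before binarization (WSD), then take the sign.
        wb = SignSTE.apply(self.weight + self.wsd)
        # No scaling factors: the convolution operates purely on {-1, +1} values.
        return F.conv2d(xb, wb, stride=self.stride, padding=self.padding)
```

In this reading, the shifts move the binarization threshold rather than rescaling magnitudes, so they can change which features flip sign while keeping the convolution itself purely binary.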