Binarized neural networks (BNNs) show great promise for edge applications on resource-limited hardware, but raise concerns about reduced accuracy. Motivated by complex-valued neural networks, in this paper we introduce complex representation into BNNs and propose the binary complex neural network (BCNN), a novel network design that processes binary complex inputs and weights through complex convolution while still retaining the extraordinary computational efficiency of BNNs. To ensure a fast convergence rate, we propose a novel BCNN-based batch normalization function and weight initialization function. Experimental results on CIFAR-10 and ImageNet using state-of-the-art network models (e.g., ResNet, ResNetE, and NIN) show that BCNN achieves better accuracy than the original BNN models. BCNN improves on BNNs by strengthening their learning capability through complex representation and by extending their applicability to complex-valued input data. The source code of BCNN will be released on GitHub.
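To make the core idea concrete, the sketch below illustrates how a binary complex convolution can be decomposed into real-valued binary convolutions via the complex product. This is a minimal illustration under assumed conventions (PyTorch, a plain sign binarizer, illustrative class and variable names), not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def binarize(t):
    # Sign binarization toward {-1, +1}; the straight-through estimator
    # needed for backpropagation through sign() is omitted for brevity.
    return torch.sign(t)

class BinaryComplexConv2d(nn.Module):
    """Hypothetical binary complex convolution: separate real/imaginary
    weight tensors, binarized inputs and weights, complex multiply-accumulate."""
    def __init__(self, in_ch, out_ch, kernel_size, stride=1, padding=0):
        super().__init__()
        self.w_r = nn.Parameter(torch.randn(out_ch, in_ch, kernel_size, kernel_size) * 0.01)
        self.w_i = nn.Parameter(torch.randn(out_ch, in_ch, kernel_size, kernel_size) * 0.01)
        self.stride, self.padding = stride, padding

    def forward(self, x_r, x_i):
        # Binarize inputs and weights, then realize the complex product
        # (w_r + i*w_i) * (x_r + i*x_i) with four real-valued convolutions:
        # real part = w_r*x_r - w_i*x_i, imaginary part = w_r*x_i + w_i*x_r.
        xb_r, xb_i = binarize(x_r), binarize(x_i)
        wb_r, wb_i = binarize(self.w_r), binarize(self.w_i)
        conv = lambda x, w: F.conv2d(x, w, stride=self.stride, padding=self.padding)
        out_r = conv(xb_r, wb_r) - conv(xb_i, wb_i)
        out_i = conv(xb_r, wb_i) + conv(xb_i, wb_r)
        return out_r, out_i

# Example: a complex-valued input represented as real and imaginary planes.
x_r, x_i = torch.randn(1, 3, 32, 32), torch.randn(1, 3, 32, 32)
y_r, y_i = BinaryComplexConv2d(3, 16, 3, padding=1)(x_r, x_i)
```

Because all operands are binarized, each of the four real convolutions can still be executed with the XNOR/popcount kernels used by standard BNNs, which is how the design keeps BNN-level efficiency while gaining complex representation.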