Binary neural networks (BNNs) show promise in cost- and power-constrained domains such as edge devices and mobile systems, owing to their significantly lower computation and storage demands, albeit at the cost of degraded accuracy. To close this accuracy gap, in this paper we propose adding a complementary activation function (AF) ahead of the sign-based binarization, and rely on a genetic algorithm (GA) to automatically search for ideal AFs. These AFs help extract extra information from the input data in the forward pass, while enabling improved gradient approximation in the backward pass. Fifteen novel AFs are identified through our GA-based search, most of which show improved performance (up to 2.54% on ImageNet) when tested on different datasets and network models. Our method offers a novel approach for designing both general and application-specific BNN architectures. Our code is available at http://github.com/flying-Yan/GAAF.
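The core idea, a smooth complementary AF applied before the sign binarizer, whose derivative also serves as the gradient approximation in the backward pass, can be sketched as below. This is a minimal illustration: the cubic AF form, its coefficients, and the clipping window are placeholders, since the paper's actual AFs are discovered by the GA search rather than hand-designed.

```python
import numpy as np

def complementary_af(x, a=1.0, b=0.5):
    """Hypothetical complementary activation applied before binarization.

    The real AFs in the paper are found via genetic-algorithm search;
    this cubic form is only an illustrative placeholder."""
    return a * x + b * x ** 3

def binarize_forward(x):
    # Sign-based binarization after the complementary AF: outputs in {-1, +1}.
    return np.where(complementary_af(x) >= 0.0, 1.0, -1.0)

def binarize_backward(x, grad_out, a=1.0, b=0.5):
    # Straight-through-style gradient: the derivative of the smooth AF
    # stands in for the zero-almost-everywhere derivative of sign(),
    # clipped outside [-1, 1] as is common in BNN training.
    d_af = a + 3.0 * b * x ** 2
    return grad_out * d_af * (np.abs(x) <= 1.0)
```

In a full BNN layer, `binarize_forward` would replace the plain `sign` on weights or activations, while `binarize_backward` supplies the surrogate gradient during backpropagation.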