Lighter and faster image restoration (IR) models are crucial for deployment on resource-limited devices. The binary neural network (BNN), one of the most promising model compression methods, can dramatically reduce the computations and parameters of full-precision convolutional neural networks (CNNs). However, BNNs and full-precision CNNs exhibit different properties, so the experience gained from designing CNNs can hardly be transferred to developing BNNs. In this study, we reconsider the components of binary convolution, such as the residual connection, BatchNorm, activation function, and structure, for IR tasks. We conduct systematic analyses to explain the role of each component in binary convolution and discuss the pitfalls. Specifically, we find that the residual connection can reduce the information loss caused by binarization; BatchNorm can bridge the value-range gap between the residual connection and binary convolution; and the position of the activation function dramatically affects the performance of the BNN. Based on these findings and analyses, we design a simple yet efficient basic binary convolution unit (BBCU). Furthermore, we divide IR networks into four parts and design specialized variants of BBCU for each part to explore the benefit of binarizing each of them. We conduct experiments on different IR tasks, and our BBCU significantly outperforms other BNNs and lightweight models, which shows that BBCU can serve as a basic unit for binarized IR networks. All codes and models will be released.
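To make the components discussed above concrete, the following is a minimal PyTorch sketch of a binary convolution unit assembled from them: sign binarization with a straight-through estimator (STE), a binary convolution followed by BatchNorm, a full-precision residual connection, and an activation after the residual sum. The names (BinarySign, BinaryConv2d, BasicBinaryConvUnit), the XNOR-style per-filter weight scaling, and the activation placement are illustrative assumptions following common BNN practice, not necessarily the exact BBCU design from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarySign(torch.autograd.Function):
    """Sign binarization with a clipped straight-through estimator (STE)."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients only where |x| <= 1 (clipped STE).
        return grad_output * (x.abs() <= 1).float()

class BinaryConv2d(nn.Conv2d):
    """Conv2d whose weights and activations are binarized to {-1, +1}."""
    def forward(self, x):
        bx = BinarySign.apply(x)
        # Scale binary weights by the per-filter mean absolute value
        # (XNOR-Net-style scaling; an assumption, not confirmed by the paper).
        scale = self.weight.abs().mean(dim=(1, 2, 3), keepdim=True)
        bw = BinarySign.apply(self.weight) * scale
        return F.conv2d(bx, bw, self.bias, self.stride, self.padding)

class BasicBinaryConvUnit(nn.Module):
    """Binary conv -> BatchNorm, plus a full-precision residual connection.

    The residual connection carries full-precision information that would
    otherwise be lost to binarization; BatchNorm rescales the binary-conv
    output so its value range matches the residual branch before the sum.
    The activation position here is one common choice; the paper studies
    how this placement affects performance.
    """
    def __init__(self, channels):
        super().__init__()
        self.conv = BinaryConv2d(channels, channels, 3, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.PReLU(channels)

    def forward(self, x):
        return self.act(self.bn(self.conv(x)) + x)

# Usage: the unit is shape-preserving, so it can be stacked in an IR backbone.
unit = BasicBinaryConvUnit(64)
y = unit(torch.randn(1, 64, 32, 32))  # output shape: (1, 64, 32, 32)
```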