Quantizing the floating-point weights and activations of deep convolutional neural networks to a fixed-point representation reduces the memory footprint and inference time. Recently, efforts have been directed towards zero-shot quantization, which does not require the original unlabelled training samples of a given task. The best published works rely heavily on the learned batch normalization (BN) parameters to infer the range of the activations for quantization. In particular, these methods are built upon either an empirical estimation framework or a data distillation approach for computing the range of the activations. However, the performance of such schemes degrades severely when presented with a network that does not contain BN layers. To address this, we propose a generalized zero-shot quantization (GZSQ) framework that neither requires original data nor relies on BN layer statistics. We employ a data distillation approach that leverages only the pre-trained weights of the model to estimate enriched data for range calibration of the activations. To the best of our knowledge, this is the first work that utilizes the distribution of the pre-trained weights to assist the process of zero-shot quantization. The proposed scheme significantly outperforms the existing zero-shot works, e.g., an improvement of ~33% in classification accuracy for MobileNetV2, and yields gains for several other models, both with and without BN layers, across a variety of tasks. We also demonstrate the efficacy of the proposed work across multiple open-source quantization frameworks. Importantly, our work is the first attempt towards the post-training zero-shot quantization of futuristic unnormalized deep neural networks.
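To illustrate the general idea described above, the following is a minimal, hedged sketch (in PyTorch) of weight-driven data distillation followed by activation-range calibration. It is not the authors' exact procedure: the assumption here is that per-layer targets are simply the mean and standard deviation of each convolutional or linear weight tensor, that the distilled batch is obtained by matching activation statistics to those targets, and that calibration uses plain min/max statistics. The input shape, step count, and learning rate are illustrative placeholders.

```python
# Hedged sketch: distill a calibration batch using only pre-trained weight
# statistics (no BN layers or original data), then compute activation ranges.
import torch
import torch.nn as nn


def distill_calibration_batch(model, input_shape=(8, 3, 224, 224), steps=200, lr=0.1):
    """Optimize a random batch so per-layer activation statistics match
    targets derived from the pre-trained weights (assumed objective)."""
    model.eval()
    layers = [m for m in model.modules() if isinstance(m, (nn.Conv2d, nn.Linear))]
    # Targets come only from the weights; no BN statistics are used.
    targets = [(m.weight.mean().detach(), m.weight.std().detach()) for m in layers]

    acts = []
    hooks = [m.register_forward_hook(lambda _, __, out: acts.append(out)) for m in layers]

    x = torch.randn(input_shape, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        acts.clear()
        opt.zero_grad()
        model(x)
        # Match each layer's activation mean/std to its weight-derived target.
        loss = sum((a.mean() - mu) ** 2 + (a.std() - sd) ** 2
                   for a, (mu, sd) in zip(acts, targets))
        loss.backward()
        opt.step()
    for h in hooks:
        h.remove()
    return x.detach()


def calibrate_ranges(model, data):
    """Per-layer min/max activation ranges computed on the distilled batch."""
    ranges = {}

    def make_hook(name):
        def hook(_, __, out):
            ranges[name] = (out.min().item(), out.max().item())
        return hook

    hooks = [m.register_forward_hook(make_hook(n))
             for n, m in model.named_modules() if isinstance(m, (nn.Conv2d, nn.Linear))]
    with torch.no_grad():
        model(data)
    for h in hooks:
        h.remove()
    return ranges
```

The resulting ranges could then be fed to any post-training quantization backend to set activation scales, which is how such a distilled batch would substitute for real calibration data in a BN-free network.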