Quantization of Convolutional Neural Networks (CNNs) is a common approach to ease the computational burden involved in the deployment of CNNs, especially on low-resource edge devices. However, fixed-point arithmetic is not natural to the type of computations involved in neural networks. In this work, we explore ways to improve quantized CNNs using a PDE-based perspective and analysis. First, we harness the total variation (TV) approach to apply edge-aware smoothing to the feature maps throughout the network. This aims to reduce outliers in the distribution of values and promote piece-wise constant maps, which are better suited for quantization. Second, we consider symmetric and stable variants of common CNNs for image classification, and of Graph Convolutional Networks (GCNs) for graph node classification. We demonstrate through several experiments that the property of forward stability preserves the action of a network under different quantization rates. As a result, stable quantized networks behave similarly to their non-quantized counterparts even though they rely on fewer parameters. We also find that, at times, stability even aids in improving accuracy. These properties are of particular interest for sensitive, resource-constrained, low-power, or real-time applications such as autonomous driving.
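To make the TV idea concrete, below is a minimal sketch (not the paper's exact layer) of one explicit Euler step of the edge-aware TV flow applied to a batch of feature maps in PyTorch; the function name, step size `tau`, and smoothing constant `eps` are illustrative assumptions:

```python
import torch

def tv_smooth_step(x: torch.Tensor, tau: float = 0.1, eps: float = 1e-6) -> torch.Tensor:
    """One explicit step of TV-flow smoothing on feature maps of shape (N, C, H, W).

    Smooths flat regions while preserving edges, pushing the maps toward
    piece-wise constant values (illustrative sketch, not the paper's layer).
    """
    # Forward differences along width (dx) and height (dy), zero at the boundary.
    dx = torch.zeros_like(x)
    dy = torch.zeros_like(x)
    dx[..., :, :-1] = x[..., :, 1:] - x[..., :, :-1]
    dy[..., :-1, :] = x[..., 1:, :] - x[..., :-1, :]

    # Regularized gradient magnitude |grad x| = sqrt(dx^2 + dy^2 + eps).
    mag = torch.sqrt(dx * dx + dy * dy + eps)
    px, py = dx / mag, dy / mag  # normalized gradient field

    # Backward differences give the divergence (adjoint of the forward difference).
    div = torch.zeros_like(x)
    div[..., :, 0] = px[..., :, 0]
    div[..., :, 1:] = px[..., :, 1:] - px[..., :, :-1]
    div[..., 0, :] += py[..., 0, :]
    div[..., 1:, :] += py[..., 1:, :] - py[..., :-1, :]

    # Explicit Euler step of the TV gradient flow: x <- x + tau * div(grad x / |grad x|).
    return x + tau * div
```

Repeating such a step drives the maps toward piece-wise constant functions whose value histograms have fewer outliers, which narrows the dynamic range a fixed-point quantizer must cover.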
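Likewise, a hedged sketch of the kind of symmetric, forward-stable residual layer alluded to above, following the common x - h * K^T sigma(K x) parameterization; the class name, step size `h`, and the choice of ReLU are assumptions for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SymmetricResidualLayer(nn.Module):
    """A forward-stable residual step x <- x - h * K^T sigma(K x) (illustrative).

    The Jacobian K^T diag(sigma'(Kx)) K is positive semi-definite for a
    monotone activation, so for a small step size h the layer cannot amplify
    perturbations such as quantization noise as depth grows.
    """

    def __init__(self, channels: int, h: float = 0.1):
        super().__init__()
        self.h = h  # step size of the underlying discretized ODE
        self.K = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = F.relu(self.K(x))
        # conv_transpose2d with the same weights realizes the adjoint K^T.
        KTz = F.conv_transpose2d(z, self.K.weight, padding=1)
        return x - self.h * KTz
```

Tying the two convolutions to the same weight tensor is what makes the layer symmetric; a plain residual block with two independent convolutions carries no such stability guarantee.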