Vector Quantization (VQ) is an appealing model compression method for obtaining tiny models with little accuracy loss. While methods for obtaining better codebooks and codes under a fixed clustering dimensionality have been studied extensively, optimizing the vectors themselves in favour of clustering performance, especially by reducing their dimensionality, has received little attention. This paper reports our recent progress on combining dimensionality compression with vector quantization, proposing a Low-Rank Representation Vector Quantization ($\text{LR}^2\text{VQ}$) method that outperforms previous VQ algorithms across various tasks and architectures. $\text{LR}^2\text{VQ}$ joins low-rank representation with subvector clustering to construct a new kind of building block that is optimized directly through end-to-end training over the task loss. Our proposed design pattern introduces three hyper-parameters: the number of clusters $k$, the subvector size $m$, and the clustering dimensionality $\tilde{d}$. In our method, the compression ratio is directly controlled by $m$, and the final accuracy is solely determined by $\tilde{d}$. We identify $\tilde{d}$ as a trade-off between low-rank approximation error and clustering error, and provide both theoretical analysis and experimental observations for estimating a proper $\tilde{d}$ before fine-tuning. With a proper $\tilde{d}$, we evaluate $\text{LR}^2\text{VQ}$ with ResNet-18/ResNet-50 on the ImageNet classification dataset, achieving 2.8\%/1.0\% top-1 accuracy improvements over the current state-of-the-art VQ-based compression algorithms at 43$\times$/31$\times$ compression factors.
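For intuition, here is a minimal PyTorch-style sketch of the building block described above: weight vectors are projected into a $\tilde{d}$-dimensional low-rank space, split into subvectors of size $m$, and each subvector is assigned to one of $k$ codewords. The projection `P`, the helper names, and the hard nearest-codeword assignment are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def quantize_subvectors(z, codebook):
    """Assign each m-dim subvector of z to its nearest codeword.

    z: (num_subvectors, m) subvectors in the low-rank space.
    codebook: (k, m) learnable codewords.
    Returns the quantized subvectors, shape (num_subvectors, m).
    """
    dists = torch.cdist(z, codebook)   # pairwise distances, (num_subvectors, k)
    codes = dists.argmin(dim=1)        # hard nearest-codeword assignment
    return codebook[codes]

def lr2vq_forward(W, P, codebook, m):
    """One forward pass through a sketched LR^2VQ block (illustrative).

    W: (n, d) original weight matrix.
    P: (d, d_tilde) low-rank projection with d_tilde < d.
    codebook: (k, m) codewords; m controls the compression ratio.
    """
    n, d = W.shape
    d_tilde = P.shape[1]
    assert d_tilde % m == 0, "d_tilde must be divisible by subvector size m"
    z = W @ P                              # low-rank representation, (n, d_tilde)
    sub = z.reshape(n * d_tilde // m, m)   # split into m-dim subvectors
    q = quantize_subvectors(sub, codebook)
    # In end-to-end training over the task loss, the non-differentiable
    # argmin would typically need a straight-through gradient estimator.
    return q.reshape(n, d_tilde)           # quantized low-rank weights

# Example with hypothetical sizes: d=256, d_tilde=64, m=8, k=256.
W = torch.randn(64, 256)
P = torch.randn(256, 64)
codebook = torch.randn(256, 8)
W_q = lr2vq_forward(W, P, codebook, m=8)   # (64, 64)
```

In this sketch, storage per $m$-dimensional subvector is a single $\log_2 k$-bit index, which is why a larger $m$ (at fixed $k$) directly increases the compression ratio, while $\tilde{d}$ governs how much information the low-rank projection preserves.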