Large language models (LLMs) show excellent performance but are compute- and memory-intensive. Quantization can reduce memory and accelerate inference. However, for LLMs beyond 100 billion parameters, existing methods cannot maintain accuracy or do not run efficiently on hardware. We propose SmoothQuant, a training-free, accuracy-preserving, and general-purpose post-training quantization (PTQ) solution to enable 8-bit weight, 8-bit activation (W8A8) quantization for LLMs that can be implemented efficiently. We observe that systematic outliers appear at fixed activation channels. Based on the fact that weights are easy to quantize while activations are not, SmoothQuant smooths the activation outliers by offline migrating the quantization difficulty from activations to weights with a mathematically equivalent transformation. SmoothQuant enables an INT8 quantization of both weights and activations for all the GEMMs in LLMs, including OPT-175B, BLOOM-176B, and GLM-130B. SmoothQuant has better hardware efficiency than existing techniques using mixed-precision activation quantization or weight-only quantization. We demonstrate up to 1.56x speedup and 2x memory reduction for LLMs with negligible loss in accuracy. Thanks to the hardware-friendly design, we integrate SmoothQuant into FasterTransformer, a state-of-the-art LLM serving framework, and achieve faster inference speed with half the number of GPUs compared to FP16. Our work offers a turn-key solution that reduces hardware costs and democratizes LLMs. Code is available at: https://github.com/mit-han-lab/smoothquant.
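To make the "migrating the quantization difficulty from activations to weights" idea concrete, below is a minimal sketch of the smoothing transformation on a single linear layer Y = XW. It is an illustration under assumptions, not the repository's actual API: the helper name smooth_scales, the toy tensors, and the choice of migration strength alpha = 0.5 are all for demonstration. The per-channel factor follows the form s_j = max|X_j|^alpha / max|W_j|^(1-alpha), and dividing activations by s while multiplying weights by s leaves the product mathematically unchanged.

```python
# Minimal sketch of the smoothing step, assuming a single linear layer Y = X @ W.
# Helper names and tensor shapes are illustrative, not the official SmoothQuant API.
import torch

def smooth_scales(act_max, weight_max, alpha=0.5):
    # Per-input-channel factor s_j = max|X_j|^alpha / max|W_j|^(1 - alpha).
    # alpha controls how much quantization difficulty is migrated to the weights.
    return (act_max.pow(alpha) / weight_max.pow(1 - alpha)).clamp(min=1e-5)

# Toy data: activations with one systematic outlier channel, well-behaved weights.
torch.manual_seed(0)
X = torch.randn(8, 4)
X[:, 1] *= 50.0                      # fixed outlier channel, as observed in LLM activations
W = torch.randn(4, 6)

act_max = X.abs().amax(dim=0)        # per-channel activation max (collected offline in practice)
weight_max = W.abs().amax(dim=1)     # per-input-channel weight max
s = smooth_scales(act_max, weight_max, alpha=0.5)

X_smooth = X / s                     # divide activations by s ...
W_smooth = W * s.unsqueeze(1)        # ... multiply weights by s, so X @ W == (X/s) @ (s*W)

# The transformation is mathematically equivalent up to floating-point error.
assert torch.allclose(X @ W, X_smooth @ W_smooth, atol=1e-3)
```

After this offline smoothing, the outlier magnitudes are shared between X_smooth and W_smooth, so both tensors become easier to quantize to INT8 and the GEMM can run entirely in 8-bit arithmetic, which is what enables the W8A8 kernels described in the abstract.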