Transformer is a transformative framework for modeling sequential data that has achieved remarkable performance on a wide range of tasks, but at high computational and energy cost. A popular way to improve its efficiency is model binarization, which constrains floating-point values to binary ones to significantly reduce resource consumption via cheap bitwise operations. However, existing binarization methods only aim to statistically minimize the information loss of the input distribution, ignoring the pairwise similarity modeling at the core of the attention mechanism. To this end, we propose a new binarization paradigm customized to high-dimensional softmax attention via kernelized hashing, called EcoFormer, which maps the original queries and keys into low-dimensional binary codes in Hamming space. The kernelized hash functions are learned in a self-supervised way to match the ground-truth similarity relations extracted from the attention map. Based on the equivalence between the inner product of binary codes and the Hamming distance, together with the associative property of matrix multiplication, we can approximate attention in linear complexity by expressing it as a dot product of binary codes. Moreover, the compact binary representations of queries and keys allow us to replace most of the expensive multiply-accumulate operations in attention with simple accumulations, substantially reducing the on-chip energy footprint on edge devices. Extensive experiments on both vision and language tasks show that EcoFormer consistently achieves performance comparable to standard attention while consuming far fewer resources. For example, with PVTv2-B0 on ImageNet-1K, EcoFormer reduces the energy footprint by 73% with only a 0.33% performance drop compared to standard attention. Code is available at https://github.com/ziplab/EcoFormer.
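To make the mechanism concrete, below is a minimal, self-contained sketch of binary-code attention in PyTorch. It is not the paper's implementation: the learned kernelized hash is replaced with a fixed random projection, and the helper names (`binary_hash`, `eco_attention`) and shapes are hypothetical. It illustrates the two properties the abstract relies on: for b-bit codes b_q, b_k in {-1, +1}^b, the inner product is tied to Hamming distance by <b_q, b_k> = b - 2·d_H(b_q, b_k), and the associative property of matrix multiplication lets the binary-code attention run in linear rather than quadratic time in the sequence length.

```python
import torch

def binary_hash(x, proj):
    # Map features to b-bit codes in {-1, +1} by the sign of linear projections.
    # `proj` is a fixed random projection standing in for the paper's learned
    # kernelized hash functions.
    return torch.where(x @ proj >= 0, 1.0, -1.0)

def eco_attention(q, k, v, proj, eps=1e-6):
    # q, k: (n, d); v: (n, d_v); proj: (d, b).
    # Shift codes from {-1, +1} to {0, 1} so pairwise similarities are non-negative.
    bq = (binary_hash(q, proj) + 1) / 2
    bk = (binary_hash(k, proj) + 1) / 2
    # Associativity: (bq @ bk.T) @ v == bq @ (bk.T @ v); the right-hand side
    # costs O(n * b * d_v) instead of the O(n^2) of dense attention, and with
    # binary codes the products reduce to simple accumulations.
    kv = bk.t() @ v                            # (b, d_v)
    z = bq @ bk.sum(dim=0, keepdim=True).t()   # (n, 1) row-wise normalizer
    return (bq @ kv) / (z + eps)

# Usage on random data (hypothetical shapes):
n, d, d_v, b = 196, 64, 64, 16
q, k, v = torch.randn(n, d), torch.randn(n, d), torch.randn(n, d_v)
proj = torch.randn(d, b)
out = eco_attention(q, k, v, proj)  # (n, d_v)
```

In the actual method the projection is trained so that the Hamming affinities of the codes match the similarity relations extracted from the attention map; the sketch above only shows the inference-time structure under that assumption.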