In recent years, transformer models have revolutionized Natural Language Processing (NLP) and shown promising performance on Computer Vision (CV) tasks. Despite their effectiveness, transformers' attention operations are difficult to accelerate due to their complicated data movement and quadratic computational complexity, which prohibits real-time inference on resource-constrained edge-computing platforms. To tackle this challenge, we propose Energon, an algorithm-architecture co-design approach that accelerates various transformers using dynamic sparse attention. Based on the observation that attention results depend on only a few important query-key pairs, we propose a Mix-Precision Multi-Round Filtering (MP-MRF) algorithm to dynamically identify such pairs at runtime. We adopt low-bitwidth computation in each filtering round and use high-precision tensors only in the attention stage, reducing overall complexity. In this way, we significantly reduce the computational cost with negligible accuracy loss. To execute this algorithm with lower latency and better energy efficiency, we also propose an Energon co-processor architecture, in which carefully designed pipelines and specialized optimizations jointly boost performance and reduce power consumption. Extensive experiments on both NLP and CV benchmarks demonstrate that Energon achieves $168\times$ and $8.7\times$ geometric-mean speedup and up to $10^4\times$ and $10^3\times$ energy reduction compared with an Intel Xeon 5220 CPU and an NVIDIA V100 GPU, respectively. Compared with the state-of-the-art attention accelerators SpAtten and $A^3$, Energon also achieves $1.7\times$ and $1.25\times$ speedup and $1.6\times$ and $1.5\times$ higher energy efficiency, respectively.
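To make the MP-MRF idea concrete, below is a minimal NumPy sketch of mix-precision multi-round filtering for a single query: each round scores the surviving keys at a low bitwidth and keeps only the top fraction, and full-precision attention runs only on the final survivors. The round bitwidths, the keep ratio, and the symmetric uniform quantizer are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def quantize(x, bits):
    """Symmetric uniform quantization to the given bitwidth (illustrative)."""
    scale = np.abs(x).max() / (2 ** (bits - 1) - 1)
    return np.round(x / scale) * scale

def mp_mrf_attention(q, K, V, bitwidths=(4, 8), keep_ratio=0.25):
    """Attention for one query vector: filter query-key pairs with
    low-precision score estimates over several rounds, then compute
    full-precision softmax attention on the surviving keys only."""
    candidates = np.arange(K.shape[0])
    for bits in bitwidths:  # each round raises the filtering precision
        scores = quantize(K[candidates], bits) @ quantize(q, bits)
        k = max(1, int(len(candidates) * keep_ratio))
        top = np.argsort(scores)[-k:]  # keep the most important pairs
        candidates = candidates[top]
    # High-precision attention restricted to the surviving query-key pairs.
    s = K[candidates] @ q / np.sqrt(q.shape[0])
    w = np.exp(s - s.max())
    w /= w.sum()
    return w @ V[candidates]

# Example: 128 keys of dimension 64; with two rounds at keep_ratio=0.25,
# only ~6% of pairs reach the high-precision attention stage.
rng = np.random.default_rng(0)
q = rng.standard_normal(64)
K = rng.standard_normal((128, 64))
V = rng.standard_normal((128, 64))
out = mp_mrf_attention(q, K, V)
```

Under these assumptions, the expensive full-precision dot products scale with the number of surviving pairs rather than the full sequence length, which is the source of the complexity reduction the abstract describes.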