We introduce EdgeFormer -- a parameter-efficient Transformer for on-device seq2seq generation under strict computation and memory constraints. Compared with previous parameter-efficient Transformers, EdgeFormer applies two novel principles for cost-effective parameterization, allowing it to perform better given the same parameter budget; moreover, EdgeFormer is further enhanced by a layer adaptation technique proposed for improving networks with shared layers. Extensive experiments show that EdgeFormer effectively outperforms previous parameter-efficient Transformer baselines and achieves competitive results under both computation and memory constraints. Given these promising results, we release EdgeLM -- the pretrained version of EdgeFormer, which is the first publicly available pretrained on-device seq2seq model that can be easily fine-tuned for seq2seq tasks with strong results, facilitating on-device seq2seq generation in practice.
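To make the "shared layers" idea concrete, the sketch below illustrates cross-layer parameter sharing with lightweight per-depth adaptation: one Transformer layer's weights are reused at every depth, while each depth keeps a small set of its own parameters so the shared layer can still specialize. This is a minimal illustrative sketch, not EdgeFormer's actual parameterization; the module name SharedLayerEncoder and the choice of per-depth LayerNorm adapters are assumptions for exposition only.

```python
# Minimal sketch (illustrative assumption, not EdgeFormer's implementation):
# share one Transformer layer's weights across all depths, and give each
# depth only a cheap LayerNorm adapter of its own.
import torch
import torch.nn as nn

class SharedLayerEncoder(nn.Module):
    def __init__(self, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        # One set of expensive weights, reused at every depth.
        self.shared_layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_feedforward=2048, batch_first=True)
        # Cheap per-depth parameters: here, just per-layer LayerNorms.
        self.layer_adapters = nn.ModuleList(
            nn.LayerNorm(d_model) for _ in range(num_layers))

    def forward(self, x):
        for adapt in self.layer_adapters:
            # Same shared weights each iteration; the per-depth adapter
            # lets different depths behave differently.
            x = adapt(self.shared_layer(x))
        return x

enc = SharedLayerEncoder()
out = enc(torch.randn(2, 16, 512))  # (batch, seq_len, d_model)
shared = sum(p.numel() for p in enc.shared_layer.parameters())
total = sum(p.numel() for p in enc.parameters())
print(f"shared params: {shared}, total params: {total}")
```

With this kind of sharing, total parameter count grows only by the tiny adapter cost per extra depth, which is the general motivation for combining weight sharing with per-layer adaptation under a fixed parameter budget.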