The 2017 paper ``Attention Is All You Need'' introduced the Transformer architecture and, inadvertently, spawned one of machine learning's most persistent naming conventions. We analyze 717 arXiv preprints containing ``All You Need'' in their titles (2009-2025), finding exponential growth ($R^2 > 0.994$) following the original paper, with 200 titles in 2025 alone. Among papers following the canonical ``X [Is] All You Need'' structure, ``Attention'' remains the most frequently claimed necessity (28 occurrences). Situating this phenomenon within memetic theory, we argue the pattern's success reflects competitive pressures in scientific communication that increasingly favor memorability over precision. Whether this trend represents harmless academic whimsy or symptomatic sensationalism we leave, with appropriate self-awareness, to the reader.
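The fitting procedure is not specified here; one natural reading, assuming the annual title count $N(y)$ is modeled by a log-linear least-squares fit anchored at the 2017 paper (with $N_0$ and $k$ as the fitted intercept and growth-rate parameters, symbols introduced purely for illustration), is
\[
  N(y) \approx N_{0}\, e^{\,k\,(y - 2017)}
  \quad\Longleftrightarrow\quad
  \ln N(y) \approx \ln N_{0} + k\,(y - 2017),
\]
with $R^2$ evaluated on the linear regression of $\ln N(y)$ against $y$.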