In this paper, we present DevFormer, a novel transformer-based architecture for the complex and computationally demanding problem of hardware design optimization. Despite the demonstrated efficacy of transformers in domains such as natural language processing and computer vision, their use in hardware design has been limited by the scarcity of offline data. Our approach addresses this limitation by introducing strong inductive biases, such as relative positional embeddings and action-permutation symmetry, that effectively capture the hardware context and enable efficient design optimization with limited offline data. We apply DevFormer to the problem of decoupling capacitor placement and show that it outperforms state-of-the-art methods on both simulated and real hardware, improving performance while reducing the number of components by more than 30%. Finally, we show that our approach achieves promising results on other offline, contextual, learning-based combinatorial optimization tasks.
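To make these two inductive biases concrete, the sketch below illustrates one way they can be realized in a single self-attention layer: omitting absolute position encodings keeps attention permutation-equivariant over the candidate placement sites (matching the action-permutation symmetry of placing an unordered set of decoupling capacitors), while a bias computed from pairwise coordinate offsets injects relative spatial context. This is a minimal PyTorch illustration under our own assumptions (the module name `RelPosSelfAttention` and 2-D board coordinates are ours), not the DevFormer reference implementation.

```python
import torch
import torch.nn as nn

class RelPosSelfAttention(nn.Module):
    """Self-attention over candidate placement sites with a relative
    positional bias. No absolute position encoding is used, so the layer
    is permutation-equivariant: permuting the input sites permutes the
    output in the same way. Illustrative sketch only, not the authors'
    implementation."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Maps the 2-D relative offset between two sites to a scalar
        # additive attention bias.
        self.rel_bias = nn.Sequential(
            nn.Linear(2, dim), nn.ReLU(), nn.Linear(dim, 1)
        )

    def forward(self, x: torch.Tensor, xy: torch.Tensor) -> torch.Tensor:
        # x:  (batch, n_sites, dim) site features
        # xy: (batch, n_sites, 2)   site coordinates on the board
        rel = xy.unsqueeze(2) - xy.unsqueeze(1)      # (b, n, n, 2) offsets
        bias = self.rel_bias(rel).squeeze(-1)        # (b, n, n) attention bias
        # MultiheadAttention expects one float mask per head: (b*heads, n, n).
        bias = bias.repeat_interleave(self.attn.num_heads, dim=0)
        out, _ = self.attn(x, x, x, attn_mask=bias)
        return out

if __name__ == "__main__":
    layer = RelPosSelfAttention(dim=64).eval()
    x = torch.randn(1, 10, 64)   # features of 10 candidate decap sites
    xy = torch.rand(1, 10, 2)    # their (x, y) positions
    with torch.no_grad():
        y = layer(x, xy)
        perm = torch.randperm(10)
        y_perm = layer(x[:, perm], xy[:, perm])
    # Permuting the sites permutes the output the same way.
    print(torch.allclose(y[:, perm], y_perm, atol=1e-5))  # True
```

The final check demonstrates the symmetry property directly: because nothing in the layer depends on the ordering of the sites, only on their features and pairwise offsets, reordering the candidate sites simply reorders the outputs.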