3D-related inductive biases like translational invariance and rotational equivariance are indispensable to graph neural networks operating on 3D atomistic graphs such as molecules. Inspired by the success of Transformers in various domains, we study how to incorporate these inductive biases into Transformers. In this paper, we present Equiformer, a graph neural network leveraging the strength of Transformer architectures and incorporating $SE(3)/E(3)$-equivariant features based on irreducible representations (irreps). Irreps features encode equivariant information in channel dimensions without complicating graph structures. This simplicity enables us to incorporate them directly by replacing original operations with their equivariant counterparts. Moreover, to better adapt Transformers to 3D graphs, we propose a novel equivariant graph attention, which considers both content and geometric information, such as relative positions, contained in irreps features. To improve the expressivity of the attention, we replace dot-product attention with multi-layer perceptron attention and include non-linear message passing. We benchmark Equiformer on two quantum property prediction datasets, QM9 and OC20. On QM9, among models trained with the same data partition, Equiformer achieves the best results on 11 out of 12 regression tasks. On OC20, under the setting of training with IS2RE data and optionally IS2RS data, Equiformer improves upon state-of-the-art models. Code reproducing all main results will be available soon.
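The contrast between dot-product attention and the multi-layer perceptron attention mentioned above can be sketched as follows. This is a minimal, framework-free illustration under assumed shapes and random placeholder weights; it shows only the scalar-score computation for one query node attending to its neighbors, not the paper's actual equivariant architecture.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical sizes: one query node with n = 4 neighbors,
# feature dim d = 8, MLP hidden dim h = 16 (placeholders, not from the paper).
rng = np.random.default_rng(0)
d, h, n = 8, 16, 4
W1 = rng.normal(size=(h, 2 * d))   # first MLP layer
W2 = rng.normal(size=(1, h))       # second MLP layer -> scalar score

q = rng.normal(size=d)             # query-node features
K = rng.normal(size=(n, d))        # neighbor (key) features

# Dot-product attention: score is a scaled inner product of query and key.
dot_scores = K @ q / np.sqrt(d)

# MLP attention: score is a small non-linear network applied to the
# concatenated query-neighbor pair features.
mlp_scores = np.array(
    [(W2 @ np.maximum(W1 @ np.concatenate([q, k]), 0.0)).item() for k in K]
)

# Either set of scores is normalized over neighbors to give attention weights.
alpha = softmax(mlp_scores)
```

The MLP score can capture non-linear interactions between query and key features that a single inner product cannot, which is the expressivity gain the abstract refers to.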