Recently, graph neural networks (GNNs) have achieved remarkable performance on quantum mechanical problems. However, a graph convolution covers only a localized region and cannot capture long-range interactions between atoms. This behavior contradicts theoretical interatomic potentials and is a fundamental limitation of spatial-based GNNs. In this work, we propose a novel attention-based framework for molecular property prediction tasks, named Geometry-aware Transformer (GeoT). We represent a molecular conformation as a discrete atomic sequence augmented with atom-atom distance attributes. In particular, we adopt a Transformer architecture, which has been widely used for sequential data. Our proposed model learns sequential representations of molecular graphs through globally constructed attention, preserving the spatial arrangements of all atom pairs. Our method avoids cost-intensive computations such as angle calculations. Experimental results on several public benchmarks, together with attention visualization maps, verify that retaining long-range interatomic attributes can significantly improve model predictability.
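The core idea above — global attention over an atomic sequence, biased by pairwise distances so that every atom can attend to every other atom regardless of graph locality — can be sketched as follows. This is a minimal illustrative example, not the exact GeoT formulation: the function name, the simple negative-distance bias term, and the `gamma` weight are all assumptions introduced for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def distance_aware_attention(h, coords, w_q, w_k, w_v, gamma=1.0):
    """One attention head over an atomic sequence.

    Scores are biased by interatomic distances (here a hypothetical
    negative-distance penalty weighted by gamma), so attention is
    global rather than limited to a local graph neighborhood.
    """
    q, k, v = h @ w_q, h @ w_k, h @ w_v
    # pairwise atom-atom distance matrix, shape (n_atoms, n_atoms)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    scores = q @ k.T / np.sqrt(q.shape[-1]) - gamma * d
    return softmax(scores, axis=-1) @ v

# toy molecule: 4 atoms, 8-dim features, 3-D coordinates
rng = np.random.default_rng(0)
n_atoms, feat = 4, 8
h = rng.normal(size=(n_atoms, feat))
coords = rng.normal(size=(n_atoms, 3))
w_q, w_k, w_v = (rng.normal(size=(feat, feat)) for _ in range(3))
out = distance_aware_attention(h, coords, w_q, w_k, w_v)
print(out.shape)  # (4, 8): one updated feature vector per atom
```

Because the distance bias enters the attention logits directly, no angle computation is needed, which is consistent with the cost argument made in the abstract.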