Message passing neural networks have become a method of choice for learning on graphs, in particular the prediction of chemical properties and the acceleration of molecular dynamics studies. While they readily scale to large training data sets, previous approaches have proven to be less data efficient than kernel methods. We identify limitations of invariant representations as a major reason and extend the message passing formulation to rotationally equivariant representations. On this basis, we propose the polarizable atom interaction neural network (PaiNN) and improve on common molecule benchmarks over previous networks, while reducing model size and inference time. We leverage the equivariant atomwise representations obtained by PaiNN for the prediction of tensorial properties. Finally, we apply this to the simulation of molecular spectra, achieving speedups of 4-5 orders of magnitude compared to the electronic structure reference.
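As a minimal illustration of the rotational equivariance that the abstract refers to (this is not the PaiNN architecture; all variable names and the toy update are assumptions for demonstration), the sketch below keeps invariant scalar features and equivariant vector features per atom, mixes the vector channels linearly under an invariant gate, and verifies that rotating the inputs is equivalent to rotating the outputs.

```python
# Toy sketch of rotationally equivariant atomwise features (illustrative only,
# not the PaiNN update): vector features may only be mixed linearly across
# channels and gated by invariant scalars, which commutes with rotations.
import numpy as np

rng = np.random.default_rng(0)

n_atoms, n_features = 5, 8
s = rng.normal(size=(n_atoms, n_features))        # invariant (scalar) features
v = rng.normal(size=(n_atoms, n_features, 3))     # equivariant (vector) features

W = rng.normal(size=(n_features, n_features))     # channel-mixing weights

def update(s, v):
    """Toy update: invariant scalars gate a linear channel mix of the vectors."""
    gate = np.tanh(s)[..., None]                   # shape (n_atoms, n_features, 1)
    mixed = np.einsum("fg,nfc->ngc", W, v)         # mix channels, keep 3D directions
    return gate * mixed

# Random rotation matrix via QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

rotate_after = update(s, v) @ Q.T                  # rotate the output vectors
rotate_before = update(s, v @ Q.T)                 # rotate the input vectors first

print(np.allclose(rotate_after, rotate_before))    # True: the update is equivariant
```

Because the update never builds nonlinear functions of the vector components themselves (only of the invariant scalars), the check prints `True` for any rotation, which is the defining property of an equivariant representation.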