Equivariant Transformers for Neural Network based Molecular Potentials

ICLR 2022 Spotlight · 29 Sept 2021 (edited 09 Mar 2022)
  • Keywords: Molecular Modeling, Quantum Chemistry, Attention, Transformers
  • Abstract: The prediction of quantum mechanical properties has historically been plagued by a trade-off between accuracy and speed. Machine learning potentials have previously shown great success in this domain, reaching increasingly better accuracy while maintaining computational efficiency comparable to classical force fields. In this work we propose TorchMD-NET, a novel equivariant Transformer (ET) architecture that outperforms the state of the art on MD17, ANI-1, and many QM9 targets in both accuracy and computational efficiency. Through an extensive analysis of attention weights, we gain valuable insights into the black-box predictor and show differences in the learned representation of conformers versus conformations sampled from molecular dynamics or normal modes. Furthermore, we highlight the importance of datasets that include off-equilibrium conformations for the evaluation of molecular potentials. (A minimal illustrative sketch of the equivariant attention idea follows this listing.)
  • One-sentence Summary: We propose a novel equivariant Transformer architecture for the prediction of molecular potentials and provide insights into the molecular representation through extensive analysis of the model's attention weights.
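
The abstract describes the equivariant Transformer only at a high level, so the following is a minimal, hypothetical sketch of how a rotation-equivariant attention update can be structured in PyTorch. It is not the authors' implementation; the module name `SimpleEquivariantAttention`, the feature shapes, and the Gaussian radial-basis distance filter are simplifying assumptions made for illustration only.

```python
# Illustrative sketch of a rotation-equivariant attention update, inspired by the
# idea of an equivariant Transformer. This is NOT the TorchMD-NET code; all names,
# dimensions, and the distance filter are simplifying assumptions.
import torch
import torch.nn as nn


class SimpleEquivariantAttention(nn.Module):
    def __init__(self, feat_dim: int, num_rbf: int = 16, cutoff: float = 5.0):
        super().__init__()
        # Scalar (invariant) projections for queries, keys, and values.
        self.q = nn.Linear(feat_dim, feat_dim)
        self.k = nn.Linear(feat_dim, feat_dim)
        self.v = nn.Linear(feat_dim, feat_dim)
        # Distance filter: Gaussian radial basis -> per-feature weights (invariant).
        self.register_buffer("rbf_centers", torch.linspace(0.0, cutoff, num_rbf))
        self.dist_proj = nn.Linear(num_rbf, feat_dim)
        # Gate controlling how strongly each feature contributes to the vector update.
        self.vec_gate = nn.Linear(feat_dim, feat_dim)

    def forward(self, x: torch.Tensor, vec: torch.Tensor, pos: torch.Tensor):
        # x:   (N, F)    invariant per-atom features
        # vec: (N, 3, F) equivariant per-atom vector features
        # pos: (N, 3)    atom positions
        rij = pos[None, :, :] - pos[:, None, :]                # (N, N, 3) relative vectors
        dij = rij.norm(dim=-1, keepdim=True).clamp(min=1e-9)   # (N, N, 1) distances
        dir_ij = rij / dij                                     # unit directions, rotate with the input

        # Radial basis expansion of distances (invariant under rotation).
        rbf = torch.exp(-(dij - self.rbf_centers) ** 2)        # (N, N, num_rbf)
        dist_filter = self.dist_proj(rbf)                      # (N, N, F)

        # Attention weights computed from invariant quantities only.
        attn = (self.q(x)[:, None, :] * self.k(x)[None, :, :] * dist_filter).sum(-1)
        attn = torch.softmax(attn / x.shape[-1] ** 0.5, dim=-1)  # (N, N)

        # Invariant feature update: attention-weighted sum of filtered values.
        values = self.v(x) * dist_filter                        # (N, N, F)
        x_out = x + torch.einsum("ij,ijf->if", attn, values)

        # Equivariant update: unit directions (which rotate with the input) are scaled
        # by invariant gates, so the resulting vector features rotate with the input too.
        gate = self.vec_gate(x)[None, :, :]                     # (1, N, F)
        vec_msg = torch.einsum("ij,ijc,ijf->icf", attn, dir_ij, gate * values)
        vec_out = vec + vec_msg
        return x_out, vec_out
```

The design choice to note is that attention weights and gates are computed only from rotation-invariant quantities (scalar features and interatomic distances), while directional information enters exclusively through unit vectors that rotate with the input coordinates; this separation is what keeps the vector-feature update equivariant.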