Abstract: We present GPS++, a hybrid Message Passing Neural Network / Graph Transformer model for molecular property prediction. Our model integrates a well-tuned local message passing component and biased global attention with other key ideas from the prior literature to achieve state-of-the-art results on the large-scale molecular dataset PCQM4Mv2. Through a thorough ablation study, we highlight the impact of individual components and find that nearly all of the model’s performance can be maintained without any use of global self-attention, showing that message passing is still a competitive approach for 3D molecular property prediction despite the recent dominance of graph transformers. We also find that our approach is significantly more accurate than prior art when 3D positional information is not available.
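For orientation, the following is a minimal sketch of how a hybrid block of the kind described in the abstract is typically composed: a local message passing update and a bias-modulated global self-attention update computed over the same node states and then combined. This is not the authors' implementation (see the repository linked below for that); the class and argument names (`HybridLayer`, `attn_bias`) are illustrative assumptions.

```python
import torch
import torch.nn as nn


class HybridLayer(nn.Module):
    """Illustrative GPS-style layer: local MPNN branch + biased global attention branch."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        # Local message passing: messages built from sender/receiver node states.
        self.msg_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.node_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        # Global self-attention over all nodes; the additive bias can encode
        # structural or 3D-distance information (an assumption for illustration).
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_local = nn.LayerNorm(dim)
        self.norm_global = nn.LayerNorm(dim)

    def forward(self, x, edge_index, attn_bias=None):
        # x: [num_nodes, dim]; edge_index: [2, num_edges]
        # attn_bias: optional [num_nodes, num_nodes] float tensor added to attention logits
        src, dst = edge_index
        messages = self.msg_mlp(torch.cat([x[src], x[dst]], dim=-1))
        agg = torch.zeros_like(x).index_add_(0, dst, messages)  # sum-aggregate per receiver node
        local = self.norm_local(x + self.node_mlp(torch.cat([x, agg], dim=-1)))

        q = x.unsqueeze(0)  # treat the whole graph as one attention "batch"
        attn_out, _ = self.attn(q, q, q, attn_mask=attn_bias)
        global_ = self.norm_global(x + attn_out.squeeze(0))

        return local + global_  # combine the two branches before the next layer
```

In architectures of this family the attention branch can be dropped entirely, leaving only the message passing branch, which is the ablation the abstract refers to.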
Submission Length: Regular submission (no more than 12 pages of main content)
Code: https://github.com/graphcore/ogb-lsc-pcqm4mv2
Supplementary Material: pdf
Assigned Action Editor: ~Ole_Winther1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1156