Keywords: 3D Equivariant Graph Neural Networks, Molecular Structures, Protein Graphs, Representation Learning, Non-Linear Attention Convolution
TL;DR: We implement a simple attention-based 3D equivariant GNN and show its performance on learning tasks involving large molecular structures
Abstract: Learning and reasoning about 3D molecular structures of varying size is an emerging and important challenge in machine learning, especially for the development of biotherapeutics. Equivariant Graph Neural Networks (GNNs) can simultaneously exploit the geometric and relational detail of the problem domain, and they are known to learn expressive representations by propagating information between nodes using higher-order representations that faithfully capture the geometry of the data, such as directionality, in their intermediate layers. In this work, we propose an equivariant GNN that operates on Cartesian coordinates to incorporate directionality, and we implement a novel attention mechanism that acts as a content- and spatially-dependent filter when propagating information between nodes. Our message function processes vector features in a geometrically meaningful way by mixing existing vectors and creating new ones via cross products. We demonstrate the efficacy of our architecture in accurately predicting properties of large biomolecules and show its computational advantage over recent methods that rely on irreducible representations via the spherical harmonics expansion.
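The abstract's core idea, a message function that mixes existing vector features and creates new ones with cross products, weighted by rotation-invariant coefficients, can be illustrated with a minimal sketch. The function names, the choice of invariant scalars, and the stand-in weight matrix below are illustrative assumptions, not the paper's actual architecture; the sketch only demonstrates why such a construction is equivariant under proper rotations.

```python
import numpy as np

def equivariant_message(x_i, x_j, v_i, v_j, weights):
    """Hypothetical sketch of a cross-product-based equivariant message.

    x_i, x_j : (3,) Cartesian node positions
    v_i, v_j : (3,) per-node vector features
    weights  : (3, 4) array standing in for a learned MLP / attention filter
    """
    r = x_i - x_j                                    # equivariant relative position
    # Rotation-invariant scalars feeding the content/spatial filter:
    s = np.array([r @ r, v_i @ r, v_j @ r, v_i @ v_j])
    c = np.tanh(weights @ s)                         # 3 invariant mixing coefficients
    # Output: linear mix of existing vectors plus a new cross-product vector.
    # Since c is invariant and r, v_j, and r x v_j all rotate with the frame
    # (for proper rotations, R(a x b) = Ra x Rb), the message is equivariant.
    return c[0] * r + c[1] * v_j + c[2] * np.cross(r, v_j)
```

Because the learned filter only ever sees invariant scalars, rotating all inputs rotates the output by the same matrix, which is the equivariance property the abstract relies on.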
PDF File: pdf
Type Of Submission: Full paper proceedings track submission (max 9 main pages).
Agreement: Check this if you are okay with being contacted to participate in an anonymous survey.
Poster Preview: png