Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs

16 May 2022 (modified: 12 Mar 2024) · NeurIPS 2022 Submitted · Readers: Everyone
Keywords: equivariant neural networks, graph neural networks, computational physics, transformer networks
TL;DR: We propose a new equivariant graph neural network that combines Transformer architectures with message passing and achieves state-of-the-art results on quantum property prediction datasets.
Abstract: 3D-related inductive biases like translational invariance and rotational equivariance are indispensable to graph neural networks operating on 3D atomistic graphs such as molecules. Inspired by the success of Transformers in various domains, we study how to incorporate these inductive biases into Transformers. In this paper, we present Equiformer, a graph neural network leveraging the strength of Transformer architectures and incorporating SE(3)/E(3)-equivariant features based on irreducible representations (irreps). Irreps features encode equivariant information in channel dimensions without complicating graph structures. This simplicity enables us to incorporate them directly by replacing the original operations with equivariant counterparts. Moreover, to better adapt Transformers to 3D graphs, we propose a novel equivariant graph attention, which considers both content and geometric information such as the relative position contained in irreps features. To improve the expressivity of the attention, we replace dot product attention with multi-layer perceptron attention and include non-linear message passing. We benchmark Equiformer on two quantum property prediction datasets, QM9 and OC20. For QM9, among models trained with the same data partition, Equiformer achieves the best results on 11 out of 12 regression tasks. For OC20, under the same setting of training with IS2RE data only, Equiformer improves upon state-of-the-art models.
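
To make the core architectural change concrete, below is a minimal, non-equivariant sketch of the MLP attention and non-linear message passing described in the abstract: instead of a query-key dot product, an MLP maps each edge's combined node and edge features to a scalar attention logit, and messages are produced by a non-linear MLP before aggregation. All names, shapes, and the segment-softmax logic here are illustrative assumptions, not the authors' implementation, which additionally uses equivariant irreps features.

```python
# Hypothetical sketch of MLP attention with non-linear message passing (plain PyTorch,
# scalar features only; Equiformer itself operates on equivariant irreps features).
import torch
import torch.nn as nn

class MLPAttentionLayer(nn.Module):
    def __init__(self, node_dim: int, edge_dim: int, hidden_dim: int):
        super().__init__()
        # MLP that scores each edge (replaces the query-key dot product).
        self.attn_mlp = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, hidden_dim),
            nn.SiLU(),
            nn.Linear(hidden_dim, 1),
        )
        # Non-linear message MLP (replaces a purely linear value projection).
        self.msg_mlp = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, hidden_dim),
            nn.SiLU(),
            nn.Linear(hidden_dim, node_dim),
        )

    def forward(self, x, edge_index, edge_attr):
        # x: [num_nodes, node_dim]; edge_index: [2, num_edges]; edge_attr: [num_edges, edge_dim]
        src, dst = edge_index
        feat = torch.cat([x[dst], x[src], edge_attr], dim=-1)

        logits = self.attn_mlp(feat).squeeze(-1)              # [num_edges]
        logits = logits - logits.max()                        # numerical stability (cancels per node)
        weights = torch.exp(logits)
        # Softmax over the incoming edges of each destination node.
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, weights)
        alpha = weights / denom[dst].clamp(min=1e-12)         # [num_edges]

        msg = self.msg_mlp(feat) * alpha.unsqueeze(-1)        # weighted non-linear messages
        out = torch.zeros_like(x).index_add_(0, dst, msg)     # aggregate per destination node
        return x + out                                        # residual update
```

In the paper's setting, the node features are irreps (mixing scalars, vectors, and higher-order tensors) and the attention logits are built from their invariant parts, so the equivariance of the update is preserved; the sketch above only conveys the attention and message-passing structure.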
Supplementary Material: zip
Community Implementations: 1 code implementation (https://www.catalyzex.com/paper/arxiv:2206.11990/code)