Equivariant Neural Operator Learning with Graphon Convolution

Published: 21 Sept 2023, Last Modified: 02 Nov 2023 · NeurIPS 2023 Spotlight
Keywords: Neural Operator Learning, Spectral Graph Theory, Graphon
TL;DR: We propose InfGCN, an equivariant neural operator learning architecture that can be interpreted as graphon convolution. InfGCN achieves state-of-the-art performance across several electron density datasets.
Abstract: We propose a general architecture that combines a coefficient learning scheme with a residual operator layer for learning mappings between continuous functions in 3D Euclidean space. Our proposed model is guaranteed to be SE(3)-equivariant by design. From the graph spectral viewpoint, our method can be interpreted as convolution on graphons (dense graphs with infinitely many nodes), so we term it InfGCN. By leveraging both the continuous graphon structure and the discrete graph structure of the input data, our model can effectively capture geometric information while preserving equivariance. Through extensive experiments on large-scale electron density datasets, we observed that our model significantly outperformed the current state-of-the-art architectures. Multiple ablation studies were also carried out to demonstrate the effectiveness of the proposed architecture.
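To make the abstract's two key ideas concrete, here is a minimal, illustrative sketch of (a) representing a continuous function by coefficients over a fixed basis (the coefficient learning scheme) and (b) a graphon convolution, i.e. an integral operator (Wf)(x) = ∫ W(x, y) f(y) dy, approximated by Monte Carlo quadrature. All function and variable names here are hypothetical and greatly simplified; this is not the paper's InfGCN implementation, which additionally uses spherical harmonics and message passing to obtain SE(3)-equivariance.

```python
import numpy as np

def gaussian_basis(x, centers, width=0.5):
    """Evaluate radial (Gaussian) basis functions centered at `centers`
    for a batch of 3D points `x`; returns shape (n_points, n_basis)."""
    d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=-1)
    return np.exp(-(d / width) ** 2)

def graphon_conv(coeffs, centers, kernel, n_quad=256, seed=0):
    """Monte Carlo approximation of (W f)(c_i) = \\int W(c_i, y) f(y) dy
    for f represented by basis coefficients, re-projected onto the same
    basis by least squares. `kernel` plays the role of the graphon W."""
    rng = np.random.default_rng(seed)
    # Quadrature points in a box; a real implementation would integrate
    # over the relevant domain with a proper measure.
    y = rng.uniform(-1.0, 1.0, size=(n_quad, centers.shape[1]))
    f_y = gaussian_basis(y, centers) @ coeffs        # sample f at y
    K = kernel(centers[:, None, :], y[None, :, :])   # W(c_i, y_j)
    g = K @ f_y / n_quad                             # (W f) at the centers
    B = gaussian_basis(centers, centers)
    new_coeffs, *_ = np.linalg.lstsq(B, g, rcond=None)
    return new_coeffs

# Toy example: three basis centers, one coefficient per center.
centers = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [0.0, 0.5, 0.0]])
coeffs = np.array([1.0, -0.5, 0.25])
smooth_kernel = lambda x, y: np.exp(-np.linalg.norm(x - y, axis=-1))
out = graphon_conv(coeffs, centers, smooth_kernel)
print(out.shape)  # one updated coefficient per basis center
```

Because the kernel depends only on distances, this toy operator is rotation-invariant; the paper's contribution is a learnable architecture in which equivariance of such operators is guaranteed by construction.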
Supplementary Material: pdf
Submission Number: 5672