E(3) Equivariant Scalar Interaction Network

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Equivariant Neural Network, Machine learning for Science, Lie group
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose an equivariant neural network, the Scalar Interaction Network (SINet).
Abstract: Equivariant Graph Neural Networks have demonstrated exceptional performance in modeling the geometric data frequently encountered in natural science research. The fundamental component of such models is the equivariant operation, realized through operations such as the tensor product and scalarization. We present a conceptual framework that unifies equivariant operations via equivariant basis decomposition. Within this framework, we generalize the idea of replacing the equivariant basis with input features to design efficient equivariant operations capable of modeling different type-$l$ features. Building on this, we propose Scalar Interaction and use it to design an equivariant network, the Scalar Interaction Network (SINet). SINet efficiently maps high type-$l$ features while maintaining $O(L^2)$ complexity in the maximum degree $L$, a significant improvement over the $O(L^6)$ complexity of tensor-product methods. Empirical results demonstrate SINet's capability to model complex quantum systems with high precision and computational efficiency. Its performance is competitive with current state-of-the-art methods in the field, showcasing its potential to advance the modeling of geometric data. This work highlights the potential of scalar interaction as a building block for constructing equivariant networks and opens up new avenues for future exploration in these fields.
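The abstract's core idea, that multiplying a type-$l$ feature by a rotation-invariant scalar preserves equivariance at cost linear in the feature's dimension, can be illustrated with a minimal sketch. This is not the paper's implementation: the function name `scalar_interaction`, the dict-of-degrees layout, and the norm-based gating are illustrative assumptions standing in for the learned scalar interaction described in the abstract.

```python
import numpy as np

def scalar_interaction(features, weights):
    """Hedged sketch of a scalar-gated equivariant operation.

    features: dict {l: array of shape (channels, 2l+1)} -- type-l features.
    weights:  dict {l: array of shape (channels,)} -- fixed scalar weights,
              a stand-in for a learned function of invariants.

    Gating a type-l feature by a rotation-invariant scalar (here, its
    per-channel norm) keeps the output equivariant, and avoids the cost
    of full tensor-product couplings between feature types.
    """
    out = {}
    for l, x in features.items():
        # Invariant scalars: per-channel norms of the type-l vectors.
        norms = np.linalg.norm(x, axis=-1, keepdims=True)  # (channels, 1)
        gate = weights[l][:, None] * norms                 # invariant gate
        out[l] = gate * x                                  # equivariant output
    return out
```

Because the gate depends only on rotation-invariant norms, rotating the input type-1 features and then applying the operation gives the same result as applying the operation first and rotating the output.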
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6952