Track: Short paper
Keywords: Geometric deep learning, Transformers, Virtual node learning, Equivariance
Abstract: Equivariant neural networks can effectively model physical systems by naturally handling the underlying geometric quantities and preserving their symmetries, but scaling them to large geometric data remains challenging. Naive downsampling typically disrupts the features’ transformation laws, limiting its applicability in large-scale settings. In this work, we propose a scalable equivariant transformer that efficiently processes geometric data in a coarse-grained latent space while preserving the E(3) symmetries of the problem. In particular, by building on the Geometric Algebra Transformer (GATr) and PerceiverIO architectures, our method learns equivariant latent tokens that allow us to decouple the processing complexity from the input data representation while maintaining global equivariance.
Presenter: ~Thomas_Hehn1
Submission Number: 23
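A minimal, non-equivariant sketch of the latent-token mechanism the abstract describes: PerceiverIO-style learned latent tokens cross-attend to the input tokens once, so all subsequent processing runs in the small latent space, decoupling compute from input size. This is an illustration under assumptions only; the class name, hyperparameters, and plain attention layers are hypothetical stand-ins for the paper's geometric-algebra (GATr) layers that provide E(3) equivariance.

import torch
import torch.nn as nn

class LatentCrossAttentionEncoder(nn.Module):  # hypothetical name, not from the paper
    def __init__(self, dim: int = 64, num_latents: int = 32, num_heads: int = 4):
        super().__init__()
        # Learned latent tokens, shared across all inputs.
        self.latents = nn.Parameter(0.02 * torch.randn(num_latents, dim))
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.self_attn = nn.TransformerEncoderLayer(
            dim, num_heads, dim_feedforward=4 * dim, batch_first=True
        )

    def forward(self, inputs: torch.Tensor) -> torch.Tensor:
        # inputs: (batch, num_inputs, dim); num_inputs may be very large.
        latents = self.latents.unsqueeze(0).expand(inputs.shape[0], -1, -1)
        # Latents query the inputs once: cost O(num_latents * num_inputs).
        latents, _ = self.cross_attn(latents, inputs, inputs)
        # Further layers cost O(num_latents^2), independent of input size.
        return self.self_attn(latents)

encoder = LatentCrossAttentionEncoder()
x = torch.randn(2, 10_000, 64)  # 10k input tokens
print(encoder(x).shape)         # torch.Size([2, 32, 64])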