Geometric Algebra Attention Networks for Small Point Clouds

Published: 28 Jan 2022, Last Modified: 13 Feb 2023. ICLR 2022 Submission.
Keywords: deep learning, geometric algebra, equivariance, geometric deep learning, rotation equivariance, permutation equivariance, chemistry, physics, biology, attention, point cloud
Abstract: Much of the success of deep learning is drawn from building architectures that properly respect underlying symmetry and structure in the data on which they operate, a set of considerations that have been united under the banner of geometric deep learning. Problems in the physical sciences often deal with relatively small sets of points in two- or three-dimensional space, where translation, rotation, and permutation equivariance are important or even vital for models to be useful in practice. In this work, we present rotation- and permutation-equivariant architectures for deep learning on these small point clouds, composed of a set of products of terms from the geometric algebra and reductions over those products using an attention mechanism. The geometric algebra provides valuable mathematical structure by which to combine vector, scalar, and other types of geometric inputs in a systematic way to account for rotation invariance or covariance, while attention yields a powerful way to impose permutation equivariance. We demonstrate the usefulness of these architectures by training models to solve sample problems relevant to physics, chemistry, and biology.
One-sentence Summary: We use geometric algebra and attention to build deep learning architectures with rotation and permutation equivariance suitable for many applications in physics, chemistry, and biology.
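To make the construction concrete, here is a minimal NumPy sketch of one such layer under stated assumptions: rotation-invariant scalars are taken from the geometric product of each pair of centered points (the scalar/dot part and the magnitude of the bivector/wedge part), attention logits and values are built from those invariants, and a softmax over partner points performs the permutation-equivariant reduction. The names `pair_invariants`, `attention_reduction`, `w_score`, and `w_value` are illustrative, not the paper's API; the actual architecture stacks several such layers and, as the abstract notes, also supports rotation-covariant (vector-valued) outputs.

```python
import numpy as np

def pair_invariants(r):
    """Rotation-invariant scalars from geometric products of point pairs.

    The geometric product of two vectors decomposes into a scalar (dot)
    part and a bivector (wedge) part; the dot product and the wedge
    magnitude are both invariant under global rotations.
    """
    dots = r @ r.T                                  # (N, N) scalar parts
    wedge = np.cross(r[:, None, :], r[None, :, :])  # (N, N, 3) bivector duals
    wedge_norm = np.linalg.norm(wedge, axis=-1)     # (N, N) bivector magnitudes
    return np.stack([dots, wedge_norm], axis=-1)    # (N, N, 2)

def attention_reduction(r, w_score, w_value):
    """One permutation-equivariant attention reduction over pair products.

    Scores and values depend only on rotational invariants, so each
    per-point output is rotation-invariant, and permuting the input
    points simply permutes the output rows.
    """
    inv = pair_invariants(r - r.mean(axis=0))      # center: translation invariance
    scores = inv @ w_score                         # (N, N) attention logits
    scores -= scores.max(axis=1, keepdims=True)    # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over partners j
    values = inv @ w_value                         # (N, N, d) per-pair values
    return np.einsum('ij,ijd->id', weights, values)  # reduce over j

# Example: eight random 3D points with randomly initialized weights.
rng = np.random.default_rng(0)
points = rng.normal(size=(8, 3))
out = attention_reduction(points, rng.normal(size=2), rng.normal(size=(2, 4)))
print(out.shape)  # (8, 4): one rotation-invariant feature vector per point
```

Because every learned quantity acts only on rotation-invariant scalars, applying a global rotation to `points` leaves `out` unchanged, while reordering the points reorders the rows of `out` accordingly.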