Keywords: equivariant machine learning, tensors, orthogonal, Lorentz, symplectic
TL;DR: Learn tensors that are equivariant with respect to the orthogonal, Lorentz, and symplectic groups.
Abstract: Tensors are a fundamental data structure in many scientific domains, including time series analysis, materials science, and physics. Improving our ability to produce and handle tensors is essential to addressing problems in these domains efficiently.
In this paper, we show how to exploit the underlying symmetries of functions that map tensors to tensors. More concretely, we develop universally expressive equivariant machine learning architectures on tensors, exploiting the fact that, in many cases, these tensor functions are equivariant with respect to the diagonal action of the orthogonal, Lorentz, and/or symplectic groups.
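To make equivariance under the diagonal action concrete, here is a minimal NumPy sketch (an illustration, not the paper's architecture): a partial trace of a 3-tensor commutes with the diagonal action of the orthogonal group O(d). The function names `f` and `act` and the dimension `d = 4` are illustrative choices.

```python
import numpy as np

d = 4
rng = np.random.default_rng(0)

# Random orthogonal matrix via QR decomposition.
g, _ = np.linalg.qr(rng.normal(size=(d, d)))

T = rng.normal(size=(d, d, d))

def f(T):
    # Contract the last two indices: f(T)_a = sum_b T_{abb}.
    return np.einsum('abb->a', T)

def act(g, T):
    # Diagonal action on a 3-tensor: (g.T)_{abc} = g_{ai} g_{bj} g_{ck} T_{ijk}.
    return np.einsum('ai,bj,ck,ijk->abc', g, g, g, T)

lhs = f(act(g, T))   # act first, then apply f
rhs = g @ f(T)       # apply f first, then act
print(np.allclose(lhs, rhs))  # True: f is O(d)-equivariant
```

The check succeeds because orthogonality (g^T g = I) turns the contraction over the two traced indices back into the identity, leaving a single copy of g acting on the output.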
We showcase our results on three problems from materials science, theoretical computer science, and time series analysis. For time series, we combine our method with the increasingly popular path-signature approach, which is additionally invariant with respect to reparameterizations. Our numerical experiments show that our equivariant models outperform the corresponding non-equivariant baselines.
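As a side illustration of the reparameterization invariance mentioned above (independent of the paper's implementation), the following self-contained sketch computes the signature of a piecewise-linear path up to level 2 using Chen's identity, and checks that inserting a collinear midpoint, which reparameterizes the path without changing its image, leaves the signature unchanged.

```python
import numpy as np

def signature_level2(path):
    """Level-1 and level-2 signature of a piecewise-linear path (N x d array)."""
    d = path.shape[1]
    s1 = np.zeros(d)
    s2 = np.zeros((d, d))
    for a, b in zip(path[:-1], path[1:]):
        delta = b - a
        # Chen's identity: concatenate the running signature with one
        # linear segment, whose own level-2 signature is (delta x delta)/2.
        s2 = s2 + np.outer(s1, delta) + 0.5 * np.outer(delta, delta)
        s1 = s1 + delta
    return s1, s2

rng = np.random.default_rng(0)
path = rng.normal(size=(5, 3))

# Insert the midpoint of the first segment: same image, different sampling.
mid = 0.5 * (path[0] + path[1])
reparam = np.vstack([path[0], mid, path[1:]])

s1a, s2a = signature_level2(path)
s1b, s2b = signature_level2(reparam)
print(np.allclose(s1a, s1b) and np.allclose(s2a, s2b))  # True
```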
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 21014