GLGENN: A Novel Parameter-Light Equivariant Neural Networks Architecture Based on Clifford Geometric Algebras
TL;DR: A new equivariant neural network that balances the expressiveness of geometric algebra-based models with parameter efficiency.
Abstract: We propose, implement, and compare with competitors a new equivariant neural network architecture based on geometric (Clifford) algebras: Generalized Lipschitz Group Equivariant Neural Networks (GLGENN). These networks are equivariant to all pseudo-orthogonal transformations, including rotations and reflections, of a vector space equipped with any symmetric bilinear form, degenerate or non-degenerate. We propose a weight-sharing parametrization technique that takes into account the fundamental structures and operations of geometric algebras. Due to this technique, the GLGENN architecture is parameter-light and less prone to overfitting than baseline equivariant models. GLGENN outperforms or matches competitors on several equivariance benchmarks, including estimation of an equivariant function and a convex hull experiment, while using significantly fewer optimizable parameters.
Lay Summary: Many physical systems, such as robots, molecules, or charged particles, behave the same way even if you rotate or mirror them. Equivariant neural networks are a special class of machine learning models designed to recognize and preserve these kinds of symmetries in data. This makes equivariant models particularly powerful for applications in physics, engineering, biology, computer vision, and other scientific fields, for example, for modeling physical systems and molecules. However, current equivariant models often require large numbers of trainable parameters, leading to long training times and a tendency to overfit, especially when training data is limited. In our research, we introduce a new equivariant architecture called GLGENN (Generalized Lipschitz Group Equivariant Neural Networks), which achieves symmetry-aware learning with far fewer parameters. We base our design on a well-known mathematical framework called Clifford (geometric) algebras and introduce a novel weight-sharing technique that respects the underlying algebraic structures. We validate our model on multiple benchmark tasks, including physics-inspired simulations and geometric regression, and show that it matches or outperforms existing methods with significantly fewer trainable parameters. GLGENN opens the door to more efficient and broadly applicable equivariant models in machine learning.
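The equivariance property described above, that a model's output transforms consistently when its input is rotated or mirrored, can be illustrated with a toy example. The function `f` below is a hypothetical stand-in, not part of GLGENN: it is a simple map that is provably equivariant to orthogonal transformations because such transformations preserve vector norms.

```python
import numpy as np

# Toy illustration (not the GLGENN architecture): f(x) = x * ||x||
# is equivariant to orthogonal transformations Q, i.e. f(Qx) = Q f(x),
# because rotations and reflections preserve the norm ||x||.
def f(x):
    return x * np.linalg.norm(x)

rng = np.random.default_rng(0)
# Random orthogonal matrix (a rotation or reflection) via QR decomposition
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
x = rng.standard_normal(3)

lhs = f(Q @ x)  # transform the input first, then apply the model
rhs = Q @ f(x)  # apply the model first, then transform the output
print(np.allclose(lhs, rhs))  # True: the two orders agree
```

An equivariant network guarantees this commutation property by construction for every layer, rather than for one hand-picked function.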
Link To Code: https://github.com/katyafilimoshina/glgenn
Primary Area: Deep Learning
Keywords: equivariant neural network, geometric deep learning, geometric algebra, Clifford algebra, pseudo-orthogonal groups, Lipschitz groups, spin groups, weight sharing, equivariance
Submission Number: 10749