Rethinking SO(3)-equivariance with Bilinear Tensor Networks

11 May 2023 (modified: 12 Dec 2023) · Submitted to NeurIPS 2023
Keywords: equivariance, SO(3) symmetry, tensor data, physics
TL;DR: We design a modular, equivariant neural architecture that generalizes affine layers to SO(3) representations and exploits expressive bilinear operations to improve learning on scalar-, vector-, and tensor-valued data.
Abstract: Many datasets in scientific and engineering applications consist of objects with specific geometric structure. A common example is data that inhabits a representation of the group SO(3) of 3D rotations: scalars, vectors, tensors, etc. One way for a neural network to exploit prior knowledge of this structure is to enforce SO(3)-equivariance throughout its layers, and several such architectures have been proposed. While general methods for handling arbitrary SO(3) representations exist, they are computationally intensive and complicated to implement. We show that by judicious symmetry breaking, we can efficiently increase the expressiveness of a network operating only on vector and order-2 tensor representations of SO(2). We demonstrate the method on an important problem from High Energy Physics known as b-tagging, where particle jets originating from b-meson decays must be discriminated from an overwhelming QCD background. In this task, we find that augmenting a standard architecture with our method results in a 2.3× improvement in rejection score.
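As a toy illustration of the kind of equivariant bilinear operations the abstract alludes to (a minimal sketch, not the submission's actual architecture): the dot, cross, and outer products of 3-vectors are bilinear maps that output a scalar, a vector, and an order-2 tensor, each transforming consistently under a rotation R. All names below are hypothetical.

```python
import numpy as np

# Minimal sketch (assumed, not the paper's layer): three bilinear products
# of 3-vectors that are SO(3)-equivariant, yielding scalar, vector, and
# order-2 tensor outputs respectively.
def bilinear_products(u, v):
    scalar = u @ v               # dot product: invariant (order-0)
    vector = np.cross(u, v)      # cross product: transforms as a vector
    tensor = np.outer(u, v)      # outer product: transforms as an order-2 tensor
    return scalar, vector, tensor

# Rotation by angle theta about the z-axis.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

rng = np.random.default_rng(0)
u, v = rng.normal(size=3), rng.normal(size=3)

s, w, T = bilinear_products(u, v)
s_r, w_r, T_r = bilinear_products(R @ u, R @ v)

# Equivariance checks: the scalar is unchanged, the vector output rotates
# as R w, and the tensor output rotates as R T R^T.
assert np.isclose(s_r, s)
assert np.allclose(w_r, R @ w)
assert np.allclose(T_r, R @ T @ R.T)
```

Composing such products with representation-respecting affine maps is one way a network can mix scalar, vector, and tensor channels without leaving the space of equivariant functions.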
Submission Number: 15315