Keywords: Equivariance, Invariance, Geometric Deep Learning, Group-Equivariant Convolutional Neural Networks, Orbits
TL;DR: We present Group Invariant Global Pooling, an expressive pooling layer that uses orbits to build invariant representations.
Abstract: Much work has been devoted to devising architectures that build group-equivariant representations, while invariance is often induced using simple global pooling mechanisms. Little work has been done on creating expressive layers that are invariant to given symmetries, despite the success of permutation-invariant pooling in various tasks. In this work, we present Group Invariant Global Pooling (GIGP), an invariant pooling layer that is provably expressive enough to represent a large class of invariant functions. We validate GIGP on rotated MNIST and QM9, matching baseline performance on the former and improving results on the latter. By performing the pooling over group orbits, this invariant aggregation method improves performance while carrying out well-principled group aggregations.
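The abstract describes pooling over group orbits to obtain an invariant representation. Below is a minimal illustrative sketch of that general idea (not the authors' implementation): features are tagged with an orbit index, summed within each orbit, transformed, and then aggregated across orbits. The class name `OrbitInvariantPooling` and all parameters are hypothetical.

```python
# Illustrative sketch of orbit-based invariant pooling; assumes each feature
# vector carries an integer orbit label. Not the paper's actual GIGP code.
import torch
import torch.nn as nn


class OrbitInvariantPooling(nn.Module):
    """Pools features within each group orbit, then aggregates orbit summaries.

    Summing within an orbit is invariant to the group action (which permutes
    orbit members), and summing the transformed orbit summaries preserves
    that invariance in the final global representation.
    """

    def __init__(self, feat_dim: int, hidden_dim: int = 64):
        super().__init__()
        # Learnable transform applied to each per-orbit summary before aggregation.
        self.phi = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, feats: torch.Tensor, orbit_ids: torch.Tensor) -> torch.Tensor:
        # feats: (num_elements, feat_dim); orbit_ids: (num_elements,) integer labels.
        num_orbits = int(orbit_ids.max().item()) + 1
        # Sum features belonging to the same orbit (invariant within-orbit aggregation).
        orbit_sums = torch.zeros(num_orbits, feats.shape[-1], device=feats.device)
        orbit_sums.index_add_(0, orbit_ids, feats)
        # Transform each orbit summary, then sum across orbits for a global invariant.
        return self.phi(orbit_sums).sum(dim=0)


# Usage: 10 elements partitioned into 3 orbits, 8-dimensional features.
pool = OrbitInvariantPooling(feat_dim=8)
x = torch.randn(10, 8)
orbits = torch.tensor([0, 0, 0, 1, 1, 2, 2, 2, 2, 1])
out = pool(x, orbits)  # shape: (64,), unchanged under within-orbit permutations
```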
Supplementary Materials: pdf
Type Of Submission: Extended Abstract (4 pages, non-archival)
Submission Number: 71