Exponential Separations in Symmetric Neural Networks

Published: 31 Oct 2022, Last Modified: 10 Oct 2022
NeurIPS 2022 Accept
Keywords: deepsets, relational network, self-attention, symmetric function, set-based, separation
TL;DR: An exponential width separation between DeepSets and Relational Networks under the assumption of analytic activations.
Abstract: In this work we demonstrate a novel separation between symmetric neural network architectures. Specifically, we consider the Relational Network (Santoro et al., 2017) architecture as a natural generalization of the DeepSets (Zaheer et al., 2017) architecture, and study their representational gap. Under the restriction to analytic activation functions, we construct a symmetric function acting on sets of size $N$ with elements in dimension $D$, which can be efficiently approximated by the former architecture, but provably requires width exponential in $N$ and $D$ for the latter.