Composing Global Optimizers to Reasoning Tasks via Algebraic Objects in Neural Nets

Published: 11 Oct 2024, Last Modified: 10 Nov 2024 · M3L Poster · CC BY 4.0
Keywords: landscape analysis; modular addition; gradient dynamics; reasoning; symmetry; representation learning
TL;DR: A semi-ring structure exists in 2-layer neural nets trained with L2 loss on reasoning tasks over Abelian groups (e.g., modular addition), enabling analytical construction of global solutions from non-optimal ones, without gradient descent.
Abstract: We prove rich algebraic structures of the solution space for 2-layer neural networks with quadratic activation and L2 loss, trained on reasoning tasks over Abelian groups (e.g., modular addition). Such rich structure enables analytical construction of globally optimal solutions from partial solutions that satisfy only part of the loss, despite its high nonlinearity. We name this framework CoGO (Composing Global Optimizers). Specifically, we show that the weight space over different numbers of hidden nodes of the 2-layer network is equipped with a semi-ring algebraic structure, and that the loss function to be optimized consists of monomial potentials, which are ring homomorphisms, allowing partial solutions to be composed into global ones by ring addition and multiplication. Our experiments show that around 95% of the solutions obtained by gradient descent match our theoretical constructions exactly. Although the constructed global optimizers require only a small number of hidden nodes, our analysis of gradient dynamics shows that overparameterization asymptotically decouples the training dynamics and is beneficial. We further show that training dynamics favor simpler solutions under weight decay, and thus high-order global optimizers such as perfect memorization are unfavorable.
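For concreteness, the sketch below shows the kind of setup the abstract describes: a 2-layer network with quadratic activation trained with L2 (MSE) loss on modular addition over Z_p, with one-hot inputs and weight decay. This is a minimal illustrative reconstruction, not the authors' code; the names `ModAddNet`, `p`, and `hidden`, and all hyperparameter values, are assumptions for illustration.

```python
# Minimal sketch (assumed setup, not the authors' implementation) of the task
# in the abstract: learn (a + b) mod p with a 2-layer net, quadratic
# activation, and L2 loss.
import torch
import torch.nn as nn

p = 23        # modulus (illustrative choice)
hidden = 128  # hidden nodes; overparameterized relative to the constructed optima

class ModAddNet(nn.Module):
    def __init__(self, p, hidden):
        super().__init__()
        # Input: one-hot encodings of a and b, concatenated (2p dims).
        self.fc1 = nn.Linear(2 * p, hidden, bias=False)
        self.fc2 = nn.Linear(hidden, p, bias=False)

    def forward(self, x):
        return self.fc2(self.fc1(x) ** 2)  # quadratic activation

# All (a, b) pairs as training data; target is the one-hot of (a + b) mod p.
a, b = torch.meshgrid(torch.arange(p), torch.arange(p), indexing="ij")
a, b = a.reshape(-1), b.reshape(-1)
X = torch.cat([nn.functional.one_hot(a, p),
               nn.functional.one_hot(b, p)], dim=1).float()
Y = nn.functional.one_hot((a + b) % p, p).float()

net = ModAddNet(p, hidden)
# Weight decay included since the abstract notes it biases training
# toward simpler (lower-order) solutions.
opt = torch.optim.Adam(net.parameters(), lr=1e-3, weight_decay=1e-4)
loss_fn = nn.MSELoss()  # L2 loss, as in the abstract

for step in range(5000):
    opt.zero_grad()
    loss = loss_fn(net(X), Y)
    loss.backward()
    opt.step()
```

Under the paper's framework, the gradient-descent solutions such a run converges to are the objects one would compare against the analytically composed global optimizers.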
Is NeurIPS Submission: No
Submission Number: 101