Emergent Structures and Lifetime Structure Evolution in Artificial Neural Networks

Published: 02 Oct 2019, Last Modified: 05 May 2023. Real Neurons & Hidden Units @ NeurIPS 2019 Poster.
TL;DR: We introduce a network framework that can modify its structure during training, and we show that it can converge to various ML network archetypes such as MLPs and LCNs.
Abstract: Motivated by the flexibility of biological neural networks, whose connectivity structure changes significantly over their lifetime, we introduce the Unrestricted Recursive Network (URN) and demonstrate that it can exhibit similar flexibility during training via gradient descent. We show empirically that many of the neural network structures commonly used in practice today (including fully connected, locally connected, and residual networks of different depths and widths) can emerge dynamically from the same URN. These different structures can be derived using gradient descent on a single general loss function, where the structure of the data and the relative strengths of various regularizer terms determine the structure of the emergent network. We show that this loss function and the regularizers arise naturally when considering the symmetries of the network as well as the geometric properties of the input data.
Keywords: emergent networks, structure evolution, architecture search
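The page itself includes no code, but the mechanism the abstract describes (a single unrestricted connectivity matrix whose structure is shaped under gradient descent by the relative strengths of regularizer terms) can be illustrated with a minimal sketch. The following is a hypothetical PyTorch illustration, not the authors' implementation; all names (urn_forward, lam_sparse, lam_local) and the 1-D unit geometry are assumptions made for the example.

```python
# Hypothetical sketch (not the authors' code): a URN-like model as one
# unrestricted weight matrix over all units, applied recursively, where
# regularizer strengths steer the emergent connectivity structure.
import torch

n_units, n_in, n_out, n_steps = 64, 8, 4, 3

# Unrestricted connectivity: every unit may connect to every other unit.
W = (0.1 * torch.randn(n_units, n_units)).requires_grad_()

# Assumed 1-D geometry for the units, used by the locality regularizer.
coords = torch.linspace(0.0, 1.0, n_units)
dist = (coords[:, None] - coords[None, :]).abs()

def urn_forward(x):
    """Inject inputs into the first units, recurse, read out the last units."""
    h = torch.zeros(x.shape[0], n_units)
    h[:, :n_in] = x
    for _ in range(n_steps):           # recursive application of the same W
        h = torch.relu(h @ W.T)
    return h[:, -n_out:]

def urn_loss(x, y, lam_sparse=1e-3, lam_local=1e-2):
    task = torch.nn.functional.mse_loss(urn_forward(x), y)
    sparsity = W.abs().sum()           # L1 term: prunes connections
    locality = (dist * W.abs()).sum()  # penalizes long-range connections
    return task + lam_sparse * sparsity + lam_local * locality

opt = torch.optim.Adam([W], lr=1e-2)
x, y = torch.randn(32, n_in), torch.randn(32, n_out)
for _ in range(200):
    opt.zero_grad()
    urn_loss(x, y).backward()
    opt.step()
# A strong locality term drives W toward a banded, locally connected
# pattern; weakening it leaves a denser, more fully connected matrix.
```

Under this framing, the emergent architecture would be read off from the trained W (e.g., by thresholding small entries), mirroring the abstract's claim that the data and the regularizer strengths select among network archetypes.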