Nonparametric Neural Networks

Published: 06 Feb 2017, Last Modified: 05 May 2023 · ICLR 2017 Poster
Abstract: Automatically determining the optimal size of a neural network for a given task without prior information currently requires an expensive global search and training many networks from scratch. In this paper, we address the problem of automatically finding a good network size during a single training cycle. We introduce nonparametric neural networks, a non-probabilistic framework for conducting optimization over all possible network sizes, and prove its soundness when network growth is limited via an $\ell_p$ penalty. We train networks under this framework by continuously adding new units while eliminating redundant units via an $\ell_2$ penalty. We employ a novel optimization algorithm, which we term "Adaptive Radial-Angular Gradient Descent" or AdaRad, and obtain promising results.
TL;DR: We automatically set the size of an MLP by adding and removing units during training as appropriate.
Conflicts: cmu.edu
Keywords: Deep learning, Supervised Learning
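
The abstract describes the mechanism only at a high level, so below is a minimal, hypothetical PyTorch sketch of the general idea rather than the authors' method: a hidden layer whose width changes during training, a per-unit $\ell_2$ group penalty that lets redundant units decay toward zero, periodic removal of near-zero units, and periodic addition of fresh units. Plain SGD stands in for AdaRad, and the names and hyperparameters (new_params, lambda_reg, prune_eps, grow_every) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the authors' code) of the idea described in the abstract:
# an MLP whose hidden width changes during training. A per-unit L2 penalty on
# each hidden unit's fan-in and fan-out weights can drive redundant units
# toward zero; units whose norm falls below a threshold are removed, and a
# fresh unit is added periodically so the network can grow if useful.
# Plain SGD stands in for the paper's AdaRad optimizer, and all hyperparameters
# (lambda_reg, prune_eps, grow_every) are illustrative assumptions.
import torch

torch.manual_seed(0)

# Toy regression data.
X = torch.randn(256, 4)
y = (X ** 2).sum(dim=1, keepdim=True)
d_in, d_out = 4, 1

def new_params(width):
    """Parameter tensors for a single hidden layer of the given width."""
    W1 = (0.1 * torch.randn(width, d_in)).requires_grad_()
    b1 = torch.zeros(width, requires_grad=True)
    W2 = (0.1 * torch.randn(d_out, width)).requires_grad_()
    b2 = torch.zeros(d_out, requires_grad=True)
    return W1, b1, W2, b2

W1, b1, W2, b2 = new_params(width=8)

lr = 0.05          # SGD step size (the paper uses AdaRad instead)
lambda_reg = 1e-2  # strength of the per-unit penalty (assumed value)
prune_eps = 1e-3   # units whose weight norm falls below this are removed
grow_every = 200   # how often to prune and add a fresh unit (assumed schedule)

for step in range(2000):
    h = torch.relu(X @ W1.t() + b1)
    pred = h @ W2.t() + b2
    # Group penalty: L2 norm of each hidden unit's fan-in and fan-out weights,
    # summed over units, so that entire units can be driven toward zero.
    unit_norms = torch.sqrt((W1 ** 2).sum(dim=1) + (W2 ** 2).sum(dim=0) + 1e-12)
    loss = ((pred - y) ** 2).mean() + lambda_reg * unit_norms.sum()

    for p in (W1, b1, W2, b2):
        p.grad = None
    loss.backward()
    with torch.no_grad():
        for p in (W1, b1, W2, b2):
            p -= lr * p.grad

    if (step + 1) % grow_every == 0:
        with torch.no_grad():
            norms = torch.sqrt((W1 ** 2).sum(dim=1) + (W2 ** 2).sum(dim=0))
            keep = norms > prune_eps
            # Drop near-zero units, then append one new unit whose outgoing
            # weights start at zero so the network's output is unchanged.
            W1 = torch.cat([W1[keep], 0.1 * torch.randn(1, d_in)]).requires_grad_()
            b1 = torch.cat([b1[keep], torch.zeros(1)]).requires_grad_()
            W2 = torch.cat([W2[:, keep], torch.zeros(d_out, 1)], dim=1).requires_grad_()
        print(f"step {step + 1}: width = {W1.shape[0]}, loss = {loss.item():.4f}")
```

In this sketch, new units are appended with zero outgoing weights so that growth does not change the network's current outputs; the paper's own growth and elimination rules, and the AdaRad update, differ in detail.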