Keywords: ReLU, deep learning, optimization, parameterization, normalization, neural network, training dynamics
TL;DR: We identify a training instability in common neural network parameterizations and normalizations, and propose a novel Geometric Parameterization to fix it.
Abstract: We introduce a novel approach for analyzing the training dynamics of ReLU networks by examining the characteristic activation boundaries of individual ReLU neurons. This analysis reveals a critical instability in common neural network parameterizations and normalizations during stochastic optimization, which impedes fast convergence and hurts generalization performance. To address this issue, we propose Geometric Parameterization (GmP), a neural network parameterization technique that effectively separates the radial and angular components of weights in the hyperspherical coordinate system. We show theoretically that GmP resolves this instability, and we report empirical results on a variety of models and benchmarks verifying GmP's advantages in optimization stability, convergence speed, and generalization performance.
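To make the core idea concrete, here is a minimal PyTorch sketch of a linear layer whose per-neuron weight vectors are stored as a radial component plus hyperspherical angles, in the spirit of the radial/angular separation the abstract describes. The module name `HypersphericalLinear`, its initialization, and all implementation details are illustrative assumptions, not the paper's reference implementation of GmP.

```python
import torch
import torch.nn as nn

class HypersphericalLinear(nn.Module):
    """Illustrative sketch: a linear layer whose weight rows are stored in
    hyperspherical coordinates, i.e. weight_row = radius * unit_direction(angles).
    This separates the radial and angular components of the weights; it is an
    assumption-laden sketch, not the paper's GmP code."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        assert in_features >= 2, "need at least one angle per neuron"
        # Radial component: one scalar radius per output neuron.
        self.radius = nn.Parameter(torch.ones(out_features))
        # Angular component: in_features - 1 angles per output neuron.
        # Random angles break the symmetry between neurons (illustrative init).
        self.angles = nn.Parameter(torch.pi * torch.rand(out_features, in_features - 1))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def _unit_directions(self) -> torch.Tensor:
        # Standard hyperspherical-to-Cartesian map, applied row-wise:
        #   u_1 = cos(t_1),  u_k = cos(t_k) * prod_{j<k} sin(t_j),
        #   u_n = prod_{j<n} sin(t_j),  so every row has unit norm.
        sin, cos = torch.sin(self.angles), torch.cos(self.angles)
        sin_prod = torch.cumprod(sin, dim=1)  # prefix products of sines
        ones = torch.ones_like(self.angles[:, :1])
        body = torch.cat([ones, sin_prod[:, :-1]], dim=1) * cos
        return torch.cat([body, sin_prod[:, -1:]], dim=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Reconstruct Cartesian weights on the fly from (radius, angles).
        weight = self.radius.unsqueeze(1) * self._unit_directions()
        return x @ weight.t() + self.bias

# Usage: the optimizer then updates radii and angles as separate parameters.
layer = HypersphericalLinear(in_features=8, out_features=4)
y = layer(torch.randn(32, 8))  # shape (32, 4)
```

Because the radius and angles are independent parameters, a gradient step changes a neuron's weight norm and its direction separately, which is the decoupling the abstract attributes to GmP.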
Primary Area: Optimization for deep networks
Submission Number: 15315