Abstract: We propose the Gaussian Error Linear Unit (GELU), a high-performing neural network activation function. The GELU nonlinearity is the expected transformation of a stochastic regularizer which randomly applies the identity or zero map to a neuron's input. This stochastic regularizer is comparable to nonlinearities aided by dropout, but it removes the need for a traditional nonlinearity. The connection between the GELU and the stochastic regularizer suggests a new probabilistic understanding of nonlinearities. We perform an empirical evaluation of the GELU nonlinearity against the ReLU and ELU activations and find performance improvements across all tasks.
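As a concrete illustration of the relationship the abstract describes (not part of the original page), here is a minimal NumPy/SciPy sketch: the stochastic regularizer multiplies each input by a zero-or-one mask drawn with probability Φ(x), and taking the expectation of that random map gives GELU(x) = x·Φ(x). Function names are illustrative only.

```python
import numpy as np
from scipy.stats import norm  # standard Gaussian CDF Phi

def gelu(x):
    """GELU(x) = x * Phi(x): the expected value of the stochastic zero-or-identity map."""
    return x * norm.cdf(x)

def stochastic_zero_or_identity(x, rng):
    """Randomly apply the identity or zero map: keep x_i with probability Phi(x_i)."""
    keep = rng.random(x.shape) < norm.cdf(x)
    return np.where(keep, x, 0.0)

x = np.linspace(-3.0, 3.0, 7)
rng = np.random.default_rng(0)
samples = np.stack([stochastic_zero_or_identity(x, rng) for _ in range(20000)])
print(gelu(x))           # exact expectation
print(samples.mean(0))   # Monte Carlo average, close to gelu(x)
```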
TL;DR: A Competitor of ReLUs and ELUs with a Probabilistic Underpinning
Conflicts: uchicago.edu, ttic.edu
Community Implementations: [5 code implementations](https://www.catalyzex.com/paper/bridging-nonlinearities-and-stochastic/code)