vReLU Activation Functions for Artificial Neural Networks

Published: 01 Jan 2018, ICNC-FSKD 2018
Abstract: ReLU (Rectified Linear Unit) layers are widely used in deep learning architectures, and several ReLU variants have been proposed in the literature to improve ReLU's performance. In this paper, we analyze and compare these ReLU variants against vReLU, a V-shaped ReLU activation function for artificial neural networks. In particular, we compare Leaky ReLU (LReLU), the Exponential Linear Unit (ELU), and Scaled Exponential Linear Units (SELU) with vReLU on two datasets: MNIST and not-MNIST. We also analyze the impact of these ReLU variants on the mean and deviation values of activations in deep artificial neural networks. The comparison and analysis show that the vReLU activation function performs well on classification tasks with artificial neural networks.
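For context, below is a minimal NumPy sketch of the activation functions named in the abstract. The vReLU definition here (identity for non-negative inputs, negation for negative inputs, giving a V shape equivalent to the absolute value) is an assumption based on the "V-shaped" description, and the SELU constants are the standard self-normalizing values; consult the paper for the authors' exact formulations.

```python
import numpy as np

def relu(x):
    # Standard ReLU: max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha for negative inputs
    return np.where(x >= 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU: exponential saturation toward -alpha for negative inputs
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # SELU: scaled ELU with the standard self-normalizing constants
    return scale * np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def vrelu(x):
    # Assumed V-shaped ReLU: x for x >= 0, -x for x < 0 (i.e., |x|);
    # see the paper for the authors' exact definition.
    return np.where(x >= 0, x, -x)

if __name__ == "__main__":
    x = np.linspace(-3.0, 3.0, 7)
    print("x     :", x)
    print("vReLU :", vrelu(x))
    print("LReLU :", leaky_relu(x))
    print("ELU   :", elu(x))
    print("SELU  :", selu(x))
```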