Logarithmic Linear Units (LogLUs): A Novel Activation Function for Improved Convergence in Deep Neural Networks

ICLR 2025 Conference Submission 13583 Authors

28 Sept 2024 (modified: 13 Oct 2024) · ICLR 2025 Conference Submission · CC BY 4.0
Keywords: Activation Function, Deep Neural Networks, Optimisation
TL;DR: The Logarithmic Linear Unit (LogLU) is a novel activation function designed for deep neural networks, improving convergence speed, stability, and overall model performance.
Abstract: The Logarithmic Linear Unit (LogLU) presents a novel activation function for deep neural networks by incorporating logarithmic elements into its design, introducing non-linearity that significantly enhances both training efficiency and accuracy. LogLU addresses common limitations of widely used activation functions such as ReLU, Leaky ReLU, and ELU, which suffer from issues like the dead neuron problem and vanishing gradients. By enabling neurons to remain active with negative inputs and ensuring effective gradient flow during backpropagation, LogLU promotes more efficient convergence in gradient descent. Its capability to solve fundamental yet complex non-linear tasks, such as the XOR problem, with fewer neurons demonstrates its efficiency in capturing non-linear patterns. Extensive evaluations on benchmark datasets like Caltech 101 and Imagenette, using the InceptionV3 architecture, reveal that LogLU not only accelerates convergence but also enhances model performance compared to existing activation functions. These findings underscore LogLU's potential as an effective activation function that improves both model performance and convergence speed.
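The abstract does not state LogLU's exact formula, but it describes an activation that is linear for non-negative inputs and uses a logarithmic branch to keep neurons active for negative inputs. The sketch below implements one such logarithmic-linear form, f(x) = x for x ≥ 0 and f(x) = -log(1 - x) for x < 0, as an assumed illustration of the idea; the function names and the negative-branch formula are assumptions, not the authors' confirmed definition.

```python
import numpy as np

def loglu(x):
    # Hypothetical LogLU sketch (assumed form, not the paper's confirmed one):
    # identity on the positive side; logarithmic branch -log(1 - x) on the
    # negative side, so outputs never saturate to a flat zero (no dead neurons).
    x = np.asarray(x, dtype=float)
    neg = np.minimum(x, 0.0)            # clamp so log1p stays defined for x > 0
    return np.where(x >= 0.0, x, -np.log1p(-neg))

def loglu_grad(x):
    # Gradient of the sketch: 1 for x >= 0, 1 / (1 - x) for x < 0.
    # Strictly positive everywhere, so gradients keep flowing in backprop.
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0.0, 1.0, 1.0 / (1.0 - np.minimum(x, 0.0)))

# Example: the branch is continuous and differentiable at 0
# (value 0, gradient 1 on both sides).
print(loglu(np.array([-2.0, -0.5, 0.0, 1.5])))       # [-1.099 -0.405  0.     1.5 ]
print(loglu_grad(np.array([-2.0, -0.5, 0.0, 1.5])))  # [ 0.333  0.667  1.     1.  ]
```

Under this assumed form, the negative branch decays only logarithmically and its gradient 1/(1 - x) is always positive, which is consistent with the abstract's claims about avoiding dead neurons and vanishing gradients.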
Primary Area: learning theory
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 13583