Activation Function: Absolute Function, One Function Behaves More Individualized

Published: 01 Feb 2023, Last Modified: 13 Feb 2023, Submitted to ICLR 2023
Keywords: activation function, absolute function, individualization, universality, over-fitting, Z-Score, abstract network, concrete network, stimulation
TL;DR: A new activation function
Abstract: Inspired by how the natural world behaves, an activation function is proposed: the absolute function. Through tests on the MNIST dataset with a fully-connected neural network and a convolutional neural network, several conclusions are put forward. The accuracy curve of the absolute function oscillates slightly, unlike the accuracy curves of ReLU and leaky ReLU. The absolute function keeps the negative part equal in magnitude to the positive part, so individualization is more active than with the ReLU and leaky ReLU functions. The absolute function is also less likely to over-fit. Through tests on MNIST with an autoencoder, it is found that the leaky ReLU function performs classification tasks well, while the absolute function performs generation tasks well, because classification tasks need more universality and generation tasks need more individualization. Pleasurable stimulation and painful stimulation differ not only in magnitude but also in sign, so the negative part should be kept as a part. Stimulation that happens frequently has a low value and appears around zero in Figure 1; stimulation that happens only occasionally has a high value and appears far from zero in Figure 1. High values therefore correspond to large stimulation, which is individualization.
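As a minimal sketch (not taken from the submission itself), the absolute activation f(x) = |x| can be written as a drop-in replacement for ReLU in PyTorch; the class name AbsActivation and the small MNIST-sized network around it are illustrative assumptions:

import torch
import torch.nn as nn

class AbsActivation(nn.Module):
    # Absolute activation: f(x) = |x|, keeping the negative part with the same magnitude as the positive part.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.abs(x)

# Example usage: a toy fully-connected classifier for 28x28 MNIST images,
# with AbsActivation used where ReLU or leaky ReLU would normally appear.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 256),
    AbsActivation(),
    nn.Linear(256, 10),
)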
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes