Neural Networks Trained by Weight Permutation are Universal Approximators

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: learning theory
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Universal approximation property, permutation training, physical neural networks, learning behavior
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: The universal approximation property is fundamental to the success of neural networks and has traditionally been established for networks without any constraints on their parameters. However, recent experimental research proposed an innovative permutation-based training method, which achieves the desired classification performance by reordering, rather than modifying, the values of the weights. In this paper, we prove that permutation training can guide a ReLU network to approximate one-dimensional continuous functions. Our numerical results on more diverse scenarios further validate the effectiveness of permutation training in regression tasks. Moreover, notable observations made during weight permutation suggest that permutation training can serve as a novel tool for describing network learning behavior.
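To make the idea of permutation training concrete, the sketch below trains a small one-hidden-layer ReLU network on a one-dimensional regression target by only reordering a fixed multiset of weight values. This is an illustrative assumption, not the paper's algorithm: the greedy pairwise-swap search, the target function, and all names here are hypothetical choices for the example.

```python
# Illustrative sketch of permutation-only training (assumed greedy swap search,
# NOT the paper's method): weight VALUES are fixed at initialization and only
# their arrangement within each parameter tensor is changed.
import numpy as np

rng = np.random.default_rng(0)

# Target: a 1-D continuous function, matching the paper's approximation setting.
f = lambda x: np.sin(2 * np.pi * x)
x = np.linspace(0.0, 1.0, 128).reshape(-1, 1)
y = f(x)

hidden = 32
# Fixed multisets of weight values; only their ordering will be optimized.
w_in = rng.normal(size=(1, hidden))
b = rng.normal(size=hidden)
w_out = rng.normal(size=(hidden, 1))

def mse(w_in, b, w_out):
    h = np.maximum(x @ w_in + b, 0.0)          # ReLU hidden layer
    return float(np.mean((h @ w_out - y) ** 2))

def greedy_permute(vec, loss_fn, sweeps):
    """Hill-climb over permutations: keep a pairwise swap only if it lowers the loss."""
    vec = vec.copy()
    best = loss_fn(vec)
    for _ in range(sweeps):
        i, j = rng.integers(0, vec.size, size=2)
        vec.flat[i], vec.flat[j] = vec.flat[j], vec.flat[i]
        cur = loss_fn(vec)
        if cur <= best:
            best = cur
        else:  # undo a swap that did not help
            vec.flat[i], vec.flat[j] = vec.flat[j], vec.flat[i]
    return vec, best

print(f"initial MSE: {mse(w_in, b, w_out):.4f}")
for epoch in range(200):
    w_out, _ = greedy_permute(w_out, lambda v: mse(w_in, b, v), sweeps=hidden)
    w_in, _ = greedy_permute(w_in, lambda v: mse(v, b, w_out), sweeps=hidden)
    b, loss = greedy_permute(b, lambda v: mse(w_in, v, w_out), sweeps=hidden)
print(f"MSE after permutation-only search: {loss:.4f}")
```

Even this naive search typically reduces the regression error, which conveys the point of the abstract: useful function approximation can emerge from rearranging a fixed set of weight values rather than updating them.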
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5237