Symmetric Perceptron with Random Labels

Published: 21 May 2023, Last Modified: 25 Aug 2023. SampTA 2023 Paper.
Abstract: The symmetric binary perceptron (SBP) is a random constraint satisfaction problem (CSP) and a single-layer neural network; it exhibits intriguing features, most notably a sharp phase transition regarding the existence of satisfying solutions. In this paper, we propose two novel generalizations of the SBP by incorporating random labels. Our proposals admit a natural machine learning interpretation: any satisfying solution to the random CSP is a minimizer of a certain empirical risk. We establish that the expected number of solutions for both models undergoes a sharp phase transition and calculate the location of this transition, which corresponds to the annealed capacity in statistical physics. We then establish a universality result: the location of this transition does not depend on the underlying distribution. We conjecture that both models in fact exhibit an even stronger phase transition akin to the SBP and give rigorous evidence towards this conjecture through the second moment method.
Submission Type: Full Paper
Supplementary Materials: pdf
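
The abstract refers to the sharp phase transition in the expected (annealed) number of solutions; as a point of reference, the sketch below illustrates this first-moment computation for the classical SBP without the paper's random labels, whose exact model is not specified in the abstract. The function names `sbp_satisfies` and `annealed_capacity`, the margin parameter `kappa`, and the Gaussian disorder matrix `G` are illustrative assumptions, not the paper's notation: a candidate sign vector x is a solution when every constraint satisfies |<g_a, x>| <= kappa*sqrt(n), and the annealed threshold is where E[#solutions] = 2^n * p^(alpha*n) crosses 1, with p = P(|N(0,1)| <= kappa).

```python
import numpy as np
from scipy.stats import norm


def sbp_satisfies(x, G, kappa):
    """True iff x in {-1,+1}^n meets every symmetric constraint
    |<g_a, x>| <= kappa * sqrt(n), where g_a are the rows of G."""
    n = x.shape[0]
    return bool(np.all(np.abs(G @ x) <= kappa * np.sqrt(n)))


def annealed_capacity(kappa):
    """First-moment threshold of the classical SBP: with m = alpha*n
    Gaussian constraints, E[#solutions] = 2^n * p^(alpha*n) for
    p = P(|N(0,1)| <= kappa), which crosses 1 at alpha = log 2 / log(1/p)."""
    p = 2.0 * norm.cdf(kappa) - 1.0
    return np.log(2.0) / np.log(1.0 / p)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, kappa = 200, 1.0
    print(f"annealed capacity at kappa={kappa}: {annealed_capacity(kappa):.3f}")  # ~1.82
    # A uniformly random sign vector satisfies m independent constraints with
    # probability roughly p^m, so it is rarely a solution at constant density.
    G = rng.standard_normal((20, n))
    x = rng.choice([-1.0, 1.0], size=n)
    print("random x satisfies all 20 constraints:", sbp_satisfies(x, G, kappa))
```

The same first-moment bookkeeping is what locates the annealed capacity in the paper's labeled generalizations; the universality claim is that this location does not depend on the disorder distribution assumed here to be Gaussian.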