Understanding the Connection between Low-Dimensional Representation and Generalization via Interpolation

24 Sept 2024 (modified: 23 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Low-Dimensional Representation, Interpolation, Generalization
Abstract: In recent years, numerous studies have demonstrated a close connection between neural networks' generalization performance and their ability to learn low-dimensional representations of data. However, the theoretical foundation linking low-dimensional representations to generalization remains underexplored. In this work, we propose a theoretical framework that analyzes this relationship from the perspective of interpolation and convex combinations. We argue that lower-dimensional representations increase the likelihood that new samples can be expressed as convex combinations of training samples, thereby increasing the interpolation probability. We derive an upper bound on the generalization error in the interpolation regime, which becomes tighter as the dimensionality of the representation decreases. Furthermore, we investigate how the structure of the data manifold affects the interpolation probability by examining the volume of the convex hull formed by the manifold. Our theoretical and experimental results show that larger convex hull volumes are associated with higher interpolation probabilities. Additionally, we explore the impact of training set size on interpolation, finding a power-law relationship among training set size, convex hull volume, and interpolation probability. Overall, this study highlights the critical role of low-dimensional representations in improving the generalization performance of neural networks, supported by both theoretical insights and experimental evidence.
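The sketch below (not from the submission) illustrates the notion of interpolation described in the abstract: a test representation counts as interpolated if it lies in the convex hull of the training representations, which can be checked with a small linear-programming feasibility problem. The helper `in_convex_hull`, the Gaussian toy data, and the chosen dimensions are illustrative assumptions, not the authors' setup; the toy loop merely mirrors the qualitative claim that lower-dimensional representations make convex-hull membership more likely.

```python
# Minimal sketch, assuming representations are given as plain NumPy arrays.
# A point z is in the convex hull of the rows of Z iff there exist weights
# w >= 0 with sum(w) = 1 and Z^T w = z, which is an LP feasibility problem.
import numpy as np
from scipy.optimize import linprog

def in_convex_hull(z, Z):
    """Return True if point z of shape (d,) lies in the convex hull of the rows of Z of shape (n, d)."""
    n = Z.shape[0]
    A_eq = np.vstack([Z.T, np.ones((1, n))])   # stack Z^T w = z and sum(w) = 1
    b_eq = np.concatenate([z, [1.0]])
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n, method="highs")
    return res.success                         # feasible <=> z is a convex combination

# Toy estimate of the interpolation probability at two representation dimensions
# (hypothetical data; the paper's experiments use learned representations instead).
rng = np.random.default_rng(0)
for d in (2, 10):
    Z_train = rng.standard_normal((200, d))
    Z_test = rng.standard_normal((100, d))
    rate = np.mean([in_convex_hull(z, Z_train) for z in Z_test])
    print(f"d={d}: fraction of test points inside the training convex hull ~ {rate:.2f}")
```

With a fixed training set size, the low-dimensional case yields a much higher fraction of test points inside the training convex hull than the high-dimensional case, consistent with the abstract's argument that lower-dimensional representations raise the interpolation probability.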
Primary Area: learning theory
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3848