The ubiquity of 2-homogeneity, how its implicit bias selects features, and other stories

26 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: generalization, optimization, margins, deep learning theory
TL;DR: The paper establishes low test error and large margin guarantees for general architectures satisfying 2-homogeneity with respect to the outer layers and certain regularity conditions.
Abstract: This work studies the optimization and generalization consequences of a seemingly innocuous design choice in many modern architectures: they end with a composition of affine parameters belonging to a normalization layer and a linear layer, resulting in a fundamentally $2$-homogeneous architecture. The first set of results is abstract, showing how any architecture satisfying this type of 2-homogeneity, together with a few regularity conditions on the gradients of the inner layers, obtains large margins and low test error. As technical byproducts, this part of the story provides an implicitly biased gradient flow guarantee and also a nondecreasing margin lemma for inhomogeneous networks. The second set of results instantiates this framework for shallow normalized ReLU networks, establishing large margin and low test error via feature selection purely from random initialization and standard gradient flow. As a corollary, the paper obtains good test error for $k$-bit parity problems, in particular passing below sample complexity lower bounds from linearized analyses such as the Neural Tangent Kernel.
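To illustrate the design choice the abstract refers to, the following is a minimal sketch (not the paper's code) of a network whose outer layers are a normalization layer's affine parameters followed by a bias-free linear head; the specific modules (PyTorch `LayerNorm` and `Linear`) and the hypothetical inner feature map are assumptions chosen only to demonstrate the property. Scaling just those outer parameters by $c$ scales the output by $c^2$, i.e., the architecture is 2-homogeneous with respect to the outer layers.

```python
# Sketch: 2-homogeneity in the outer layers (LayerNorm affine params + linear head).
# Inner layers and their parameters are arbitrary and left untouched.
import torch
import torch.nn as nn

torch.manual_seed(0)
d = 16

# Hypothetical "inner" feature map; its choice does not matter for the check below.
inner = nn.Sequential(nn.Linear(8, d), nn.ReLU())

# Outer layers: normalization layer with affine parameters, then a bias-free linear head.
norm = nn.LayerNorm(d, elementwise_affine=True)
head = nn.Linear(d, 1, bias=False)

def f(x):
    return head(norm(inner(x)))

x = torch.randn(4, 8)
c = 3.0

with torch.no_grad():
    out_before = f(x).clone()
    # Scale only the outer parameters (LayerNorm weight/bias and head weight) by c.
    for p in list(norm.parameters()) + list(head.parameters()):
        p.mul_(c)
    out_after = f(x)

# 2-homogeneity: f(x; c * theta_outer, theta_inner) == c^2 * f(x; theta_outer, theta_inner).
print(torch.allclose(out_after, c**2 * out_before, atol=1e-5))  # True
```

The check works because the normalization itself is invariant to its affine parameters, so scaling the affine parameters scales the normalized features linearly, and the bias-free linear head contributes a second linear factor.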
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6186