Keywords: Theory of Deep Learning, Feature Learning, Hamiltonian
TL;DR: The evolution of the features across layers is described by Hamiltonian dynamics, and the features exhibit a low-dimensional bias.
Abstract: We study Leaky ResNets, which interpolate between ResNets ($\tilde{L}=0$)
and Fully-Connected nets ($\tilde{L}\to\infty$) depending on an 'effective
depth' hyper-parameter $\tilde{L}$. In the infinite depth limit,
we study 'representation geodesics' $A_{p}$: continuous paths in
representation space (similar to NeuralODEs) from input $p=0$ to
output $p=1$ that minimize the parameter norm of the network. We
give a Lagrangian and Hamiltonian reformulation, which highlights the
importance of two terms: a kinetic energy which favors small layer
derivatives $\partial_{p}A_{p}$ and a potential energy that favors
low-dimensional representations, as measured by the 'Cost of Identity'.
The balance between these two forces offers an intuitive understanding
of feature learning in ResNets. We leverage this intuition to explain
the emergence of a bottleneck structure, as observed in previous work:
for large $\tilde{L}$ the potential energy dominates and leads to
a separation of timescales, where the representation jumps rapidly
from the high-dimensional inputs to a low-dimensional representation,
moves slowly inside the space of low-dimensional representations, before
jumping back to the potentially high-dimensional outputs. Inspired
by this phenomenon, we train with an adaptive layer step-size
that accommodates the separation of timescales.
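To make the interpolation concrete, here is a minimal sketch of one plausible discretization of a leaky residual update. The interpolation schedule `alpha = L_tilde / (L_tilde + L)` and the `tanh` nonlinearity are assumptions for illustration, not the paper's exact parameterization: `alpha -> 0` recovers a pure skip connection (the ResNet limit, $\tilde{L}=0$), while `alpha -> 1` makes each layer fully replace its input (the fully-connected limit, $\tilde{L}\to\infty$).

```python
import numpy as np

def leaky_resnet_forward(x, weights, L_tilde):
    """Hypothetical 'leaky' residual forward pass.

    Each layer blends the skip connection with a fully-connected update:
      alpha -> 0 : output of each layer stays close to its input (ResNet-like)
      alpha -> 1 : each layer discards its input (fully-connected-like)
    The schedule alpha = L_tilde / (L_tilde + L) is an illustrative choice.
    """
    L = len(weights)
    alpha = L_tilde / (L_tilde + L)  # assumed interpolation schedule
    h = x
    for W in weights:
        # Convex combination of skip branch and nonlinear update.
        h = (1.0 - alpha) * h + alpha * np.tanh(h @ W)
    return h

rng = np.random.default_rng(0)
ws = [rng.standard_normal((8, 8)) / np.sqrt(8) for _ in range(4)]
x = rng.standard_normal((2, 8))
out_resnet = leaky_resnet_forward(x, ws, L_tilde=0.0)  # identical to the input
out_fc = leaky_resnet_forward(x, ws, L_tilde=1e9)      # near the FC limit
```

In this toy form, the "effective depth" hyper-parameter only rescales how aggressively each layer moves the representation, which is the same knob the adaptive layer step-size tunes per layer.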
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5165