LOW-RANK NETWORKS LEARN HIGH-FREQUENCY FEATURES ON LOSS LANDSCAPE VALLEYS

05 Feb 2026 (modified: 02 Mar 2026) · Submitted to Sci4DL 2026 · CC BY 4.0
Keywords: low-rank neural networks, random features, feature learning, loss landscape, saddle-to-saddle dynamics.
TL;DR: We present large-scale experiments showing that low-rank neural networks learn high-frequency features in stages: the loss exhibits a plateau and a sharp valley for each frequency to be learned.
Abstract: We ask how the geometry of the loss (flat regions, sharp drops, narrow valleys) connects to what the network learns (which frequencies, which channels do what) in a simple setting: low-rank neural networks trained to fit sums of cosines in one dimension. We give three linked results. (1) Channel specialization: we formalize when one "channel" dominates at a given input (a log-ratio criterion) and show experimentally that learning rate and batch size control whether channels specialize or collapse. (2) Oscillatory complexity: we count how many peaks and valleys each channel's output has along the input; this count grows with depth, so deeper layers learn more wiggly, high-frequency structure. (3) Loss shape and frequency: the training loss stays flat for long stretches, then drops sharply; each flat stretch corresponds to a frequency not yet learned, and each drop corresponds to the network entering a narrow valley in the loss landscape. Reducing the learning rate is what allows the optimizer to enter these valleys. We also show that plateau-escape time scales with learning rate and batch size, and that the low-rank structure preserves the symmetry of the target. All experiments use a single, reproducible setup (1D regression, cosine targets, RF-LR architecture); we do not claim the same picture holds beyond this setting.
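To make two of the abstract's measurements concrete, here is a minimal Python sketch; it is not the authors' code. The dominance threshold `tau` in the log-ratio criterion and the sign-change definition of the peak/valley count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dominant_channel(outputs, tau=1.0):
    """Log-ratio dominance criterion at a single input (sketch).

    outputs: array of shape (n_channels,) with per-channel output magnitudes.
    Returns the index of the dominating channel if its log-magnitude exceeds
    every other channel's by at least `tau` (an assumed threshold), else None.
    """
    logs = np.log(np.abs(outputs) + 1e-12)  # small epsilon avoids log(0)
    order = np.argsort(logs)
    if logs[order[-1]] - logs[order[-2]] >= tau:
        return int(order[-1])
    return None

def oscillation_count(y):
    """Count peaks and valleys of a 1D signal sampled on a grid,
    measured here as sign changes in the discrete derivative."""
    dy = np.sign(np.diff(y))
    dy = dy[dy != 0]  # drop exactly flat steps
    return int(np.sum(dy[1:] != dy[:-1]))

# Toy version of the abstract's 1D setup: a sum-of-cosines target on a grid.
x = np.linspace(0.0, 2 * np.pi, 512)
target = np.cos(x) + 0.5 * np.cos(3 * x) + 0.25 * np.cos(5 * x)
print(oscillation_count(target))  # grows with the highest target frequency
```

In this sketch, a channel whose output has a larger oscillation count is treated as carrying higher-frequency structure, mirroring the abstract's depth claim; the exact criteria the paper formalizes may differ.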
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Style Files: I have used the style files.
Submission Number: 90