Effect of Local Oscillations on the Scaling Laws of Deep Neural Networks

ICLR 2026 Conference Submission 21840 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Scaling laws; local oscillatory complexity; deep neural networks; generalization error; power-law fitting
TL;DR: Local oscillatory complexity affects standard deep learning scaling laws, causing early saturation and altered exponents—smoothness and dimension alone can’t fully explain scaling behavior.
Abstract: Deep neural network (DNN) scaling laws characterize how a model’s performance (e.g., test loss) improves as a function of resources such as training data size, model parameters, or compute. These laws hold for a wide variety of model and data types. Empirical and theoretical results have found that the parameters of the scaling laws depend on aspects of the target function such as its continuity class and dimension. Here we show that another feature of the data, namely the local oscillatory complexity (LOC) of the target function, can dramatically alter scaling behavior. In particular, when the target function is highly oscillatory (parity-like), the drop in loss with more training data becomes shallower. We formalize a metric for local oscillatory complexity and study a family of parity-like target functions in which this complexity is controlled by a frequency parameter. We show that high oscillatory complexity can shift the scaling curve upward (higher error floor), change the scaling exponent, and induce an earlier saturation regime. In our experiments, DNNs fail to benefit from additional data when the target function is highly oscillatory. These findings reveal that continuity class and dimension of the data are insufficient to guarantee standard scaling behavior: LOC must also be accounted for.
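The abstract references two technical ingredients: a frequency-controlled family of parity-like target functions with an associated LOC metric, and saturating power-law fits of loss against training-set size. The sketch below is a minimal illustration of those ideas, not the authors' code or definitions. The target f_k(x) = sign(sin(2πkx)), the sign-change LOC proxy, and the ansatz L(n) = a·n^(−b) + c are all assumptions chosen for illustration, and the loss curves are synthetic.

```python
# Minimal, hypothetical sketch: a parity-like 1-D target whose oscillatory
# complexity grows with a frequency parameter k, a crude LOC proxy, and a
# saturating power-law fit to synthetic loss-vs-data curves.
import numpy as np
from scipy.optimize import curve_fit


def parity_like_target(x, k):
    """Hypothetical parity-like target: alternates sign ~2k times on [0, 1]."""
    return np.sign(np.sin(2 * np.pi * k * x))


def loc_proxy(f, k, n_grid=10_000):
    """Crude LOC proxy: number of sign changes of f(., k) over the unit interval."""
    x = np.linspace(0.0, 1.0, n_grid)
    y = f(x, k)
    return int(np.count_nonzero(np.diff(np.sign(y)) != 0))


def saturating_power_law(n, a, b, c):
    """Scaling-law ansatz: test loss L(n) = a * n^(-b) + c, with c the error floor."""
    return a * n ** (-b) + c


if __name__ == "__main__":
    # Synthetic loss curves standing in for measured DNN test loss.
    n = np.logspace(2, 6, 20)                      # training-set sizes
    rng = np.random.default_rng(0)
    for k, (a, b, c) in [(1, (5.0, 0.5, 0.01)),    # low LOC: steep decay, low floor
                         (64, (5.0, 0.2, 0.20))]:  # high LOC: shallow decay, high floor
        loss = saturating_power_law(n, a, b, c) * (1 + 0.02 * rng.standard_normal(n.size))
        popt, _ = curve_fit(saturating_power_law, n, loss, p0=(1.0, 0.3, 0.1), maxfev=10_000)
        print(f"k={k:3d}  LOC~{loc_proxy(parity_like_target, k):4d}  "
              f"fitted exponent b={popt[1]:.2f}  floor c={popt[2]:.3f}")
```

Under this toy setup, the high-frequency (high-LOC) curve recovers a smaller fitted exponent and a higher floor, which is the qualitative pattern the abstract describes; the actual metric and experimental protocol are defined in the paper.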
Primary Area: learning theory
Submission Number: 21840